Sample records for method considerably reduces

  1. Structural Methods to Reduce Navigation Channel Shoaling

    DTIC Science & Technology

    2005-09-01

    to reduce shoaling in navigation channels,” Technical Note, Coastal and Hydraulics Laboratory, U.S. Army Engineer Research and Development Center...deal of interest in reducing or minimizing the quantity of dredging. Considerable research and extensive field experience have offered several methods...project. • Less frequent dredging usually reduces overall dredging costs. • The trap can be intentionally located close to dredged material disposal

  2. Operational considerations to reduce solar array loads

    NASA Technical Reports Server (NTRS)

    Gerstenmaier, W.

    1992-01-01

    The key parameters associated with solar array plume loads are examined, and operational considerations aimed at minimizing the effect of the Shuttle plumes on the Space Station solar arrays are discussed. These include solar array pointing to reduce loads and restrictions on Shuttle piloting. Particular attention is given to the method used to obtain the forcing functions (thruster time firing histories) for solar array plume calculation.

  3. Ultimate disposal of scrubber wastes

    NASA Technical Reports Server (NTRS)

    Cohenour, B. C.

    1978-01-01

    Part of the initial concern with using the wet scrubbers on the hypergolic propellants was the subsequent disposal of the liquid wastes. To do this, consideration was given to all possible methods to reduce the volume of the wastes and stay within the guidelines established by the state and federal environmental protection agencies. One method that was proposed was the use of water hyacinths in disposal ponds to reduce the waste concentration in the effluent to less than EPA tolerable levels. This method was under consideration and even in use by private industry, municipal governments, and NASA for upgrading existing wastewater treatment facilities to a tertiary system. The use of water hyacinths in disposal ponds appears to be a very cost-effective method for reduction and disposal of hypergolic propellants.

  4. Assessment of alternative disposal methods to reduce greenhouse gas emissions from municipal solid waste in India.

    PubMed

    Yedla, Sudhakar; Sindhu, N T

    2016-06-01

    Open dumping, the most commonly practiced method of solid waste disposal in Indian cities, creates serious environmental and economic challenges, and also contributes significantly to greenhouse gas emissions. The present article attempts to analyse and identify economically effective ways to reduce greenhouse gas emissions from municipal solid waste. The article looks at the selection of appropriate methods for the control of methane emissions. Multivariate functional models are presented, based on theoretical considerations as well as field measurements, to forecast the greenhouse gas mitigation potential for all the methodologies under consideration. Economic feasibility is tested by calculating the unit cost of waste disposal for the respective disposal process. The purpose-built landfill system proposed by Yedla and Parikh has shown promise in controlling greenhouse gas and saving land. However, these studies show that aerobic composting offers the optimal method, both in terms of controlling greenhouse gas emissions and reducing costs, mainly by requiring less land than other methods. © The Author(s) 2016.

  5. Efficient Voronoi volume estimation for DEM simulations of granular materials under confined conditions

    PubMed Central

    Frenning, Göran

    2015-01-01

    When the discrete element method (DEM) is used to simulate confined compression of granular materials, the need arises to estimate the void space surrounding each particle with Voronoi polyhedra. This entails recurring Voronoi tessellation with small changes in the geometry, resulting in a considerable computational overhead. To overcome this limitation, we propose a method with the following features:
    • A local determination of the polyhedron volume is used, which considerably simplifies implementation of the method.
    • A linear approximation of the polyhedron volume is utilised, with intermittent exact volume calculations when needed.
    • The method allows highly accurate volume estimates to be obtained at a considerably reduced computational cost.
    PMID:26150975
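
    The caching idea described above (cheap linear volume updates, with exact Voronoi tessellations only intermittently) can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the update rule, recomputation interval, and toy particle motion are assumptions, and SciPy's Voronoi/ConvexHull routines stand in for the DEM code's local volume computation.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def voronoi_cell_volumes(points):
    """Exact volumes of bounded Voronoi cells (NaN for cells touching the hull)."""
    vor = Voronoi(points)
    volumes = np.full(len(points), np.nan)
    for i, ridx in enumerate(vor.point_region):
        region = vor.regions[ridx]
        if not region or -1 in region:
            continue                            # unbounded cell: no finite volume
        volumes[i] = ConvexHull(vor.vertices[region]).volume
    return volumes

def track_volumes(positions_per_step, exact_every=10):
    """Linear extrapolation of cell volumes between intermittent exact tessellations."""
    history = []
    v_exact = voronoi_cell_volumes(positions_per_step[0])
    rate = np.zeros_like(v_exact)
    steps_since = 0
    for k, pts in enumerate(positions_per_step):
        if k > 0 and k % exact_every == 0:
            v_new = voronoi_cell_volumes(pts)   # intermittent exact recomputation
            rate = (v_new - v_exact) / exact_every
            v_exact, steps_since = v_new, 0
        history.append(v_exact + rate * steps_since)
        steps_since += 1
    return history

# Usage: particles of a small packing drifting slowly inward (a stand-in for compression).
rng = np.random.default_rng(0)
base = rng.random((60, 3))
steps = [base * (1.0 - 0.001 * k) for k in range(50)]
volumes = track_volumes(steps)
```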

  6. Preshaping command inputs to reduce telerobotic system oscillations

    NASA Technical Reports Server (NTRS)

    Singer, Neil C.; Seering, Warren P.

    1989-01-01

    The results of using a new technique for shaping inputs to a model of the space shuttle Remote Manipulator System (RMS) are presented. The shaped inputs move the system to the same location that was originally commanded; however, the oscillations of the machine are considerably reduced. An overview of the new shaping method is presented. A description of the RMS model is provided. The problem of slow joint servo rates on the RMS is accommodated with an extension of the shaping method. The results and sample data are also presented for both joint and three-dimensional Cartesian motions. The results demonstrate that the new shaping method performs well on large, telerobotic systems which exhibit significant structural vibration. The new method is also shown to result in considerable energy savings during operations of the RMS manipulator.
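
    For readers unfamiliar with input shaping, the sketch below shows the classic two-impulse zero-vibration (ZV) shaper, the simplest member of the family of techniques this abstract refers to. The mode frequency, damping ratio, and step command are invented illustration values; the actual RMS shaper design in the report may differ.

```python
import numpy as np

def zv_shaper(wn, zeta, dt):
    """Two-impulse zero-vibration (ZV) shaper for a mode with natural frequency wn
    [rad/s] and damping ratio zeta, sampled at dt [s]."""
    K = np.exp(-zeta * np.pi / np.sqrt(1.0 - zeta ** 2))
    Td = 2.0 * np.pi / (wn * np.sqrt(1.0 - zeta ** 2))   # damped period of vibration
    n = int(round(0.5 * Td / dt))                        # delay of the second impulse
    shaper = np.zeros(n + 1)
    shaper[0] = 1.0 / (1.0 + K)
    shaper[-1] = K / (1.0 + K)
    return shaper

# Shaping a step command: part of the input is delayed by half a damped period so the
# residual vibration excited by the two parts cancels.
dt = 0.01
command = np.ones(500)                                   # 5 s unit step command
shaped = np.convolve(command, zv_shaper(wn=2 * np.pi * 0.3, zeta=0.05, dt=dt))[:500]
```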

  7. Comparison of diverse methods for the correction of atmospheric effects on LANDSAT and SKYLAB images. [radiometric correction in Brazil

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Camara, G.; Dias, L. A. V.; Mascarenhas, N. D. D.; Desouza, R. C. M.; Pereira, A. E. C.

    1982-01-01

    Earth's atmosphere reduces a sensor's ability to correctly discriminate targets. Using radiometric correction to reduce the atmospheric effects may considerably improve the performance of an automatic image interpreter. Several methods for radiometric correction from the open literature are compared, leading to the development of an atmospheric correction system.

  8. Non-woody weed control in pine plantations

    Treesearch

    Phillip M. Dougherty; Bob Lowery

    1986-01-01

    The costs and benefits derived from controlling non-woody competitors in pine plantations were reviewed. Cost considerations included both the capital cost and biological cost that may be incurred when weed control treatments are applied. Several methods for reducing the cost of herbicide treatments were explored. Cost reduction considerations included adjustments in...

  9. Implementation of Testing Equipment for Asphalt Materials : Tech Summary

    DOT National Transportation Integrated Search

    2009-05-01

    Three new automated methods for related asphalt material and mixture testing were evaluated under this study. Each of these devices is designed to reduce testing time considerably and reduce operator error by automating the testing process. The Thery...

  10. Implementation of testing equipment for asphalt materials : tech summary.

    DOT National Transportation Integrated Search

    2009-05-01

    Three new automated methods for related asphalt material and mixture testing were evaluated under this study. Each of these devices is designed to reduce testing time considerably and reduce operator error by automating the testing process. The T...

  11. Reduced Order Modeling Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Helenbrook, B. T.

    2010-01-01

    The details: a) stable numerical methods are needed; b) round-off error can be considerable; c) it is not clear that the modes are correct for incompressible flow. Nonetheless, compact and accurate reduced-order models can be derived, and they can be used to generate actuator models or full flow-field models.

  12. Automatic Query Formulations in Information Retrieval.

    ERIC Educational Resources Information Center

    Salton, G.; And Others

    1983-01-01

    Introduces methods designed to reduce the role of search intermediaries by generating Boolean search formulations automatically, using term frequency considerations from natural language statements provided by system patrons. Experimental results are supplied and methods are described for applying the automatic query formulation process in practice.…
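
    As a rough illustration of the flavour of such frequency-based query formulation (not Salton's actual algorithm), the toy sketch below extracts content terms from a patron's statement, ranks them by how discriminating they are likely to be, and joins them into a Boolean expression. The stoplist, ranking rule, and document-frequency table are all invented for the example.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "in", "for", "and", "to", "on", "with", "is", "are"}

def boolean_query(statement, doc_freq, n_terms=4):
    """Keep the rarest content terms (most discriminating) and OR them together."""
    terms = [w for w in re.findall(r"[a-z]+", statement.lower()) if w not in STOPWORDS]
    counts = Counter(terms)
    # prefer terms that are rare in the collection but frequent in the request
    ranked = sorted(counts, key=lambda w: (doc_freq.get(w, 0), -counts[w]))[:n_terms]
    return " OR ".join(ranked)

# Hypothetical collection document frequencies for a handful of terms.
df = {"retrieval": 120, "boolean": 40, "query": 200, "automatic": 90, "formulation": 15}
print(boolean_query("Automatic formulation of Boolean queries for information retrieval", df))
```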

  13. Critical considerations for the application of environmental DNA methods to detect aquatic species

    USGS Publications Warehouse

    Goldberg, Caren S.; Turner, Cameron R.; Deiner, Kristy; Klymus, Katy E.; Thomsen, Philip Francis; Murphy, Melanie A.; Spear, Stephen F.; McKee, Anna; Oyler-McCance, Sara J.; Cornman, Robert S.; Laramie, Matthew B.; Mahon, Andrew R.; Lance, Richard F.; Pilliod, David S.; Strickler, Katherine M.; Waits, Lisette P.; Fremier, Alexander K.; Takahara, Teruhiko; Herder, Jelger E.; Taberlet, Pierre

    2016-01-01

    Species detection using environmental DNA (eDNA) has tremendous potential for contributing to the understanding of the ecology and conservation of aquatic species. Detecting species using eDNA methods, rather than directly sampling the organisms, can reduce impacts on sensitive species and increase the power of field surveys for rare and elusive species. The sensitivity of eDNA methods, however, requires a heightened awareness and attention to quality assurance and quality control protocols. Additionally, the interpretation of eDNA data demands careful consideration of multiple factors. As eDNA methods have grown in application, diverse approaches have been implemented to address these issues. With interest in eDNA continuing to expand, supportive guidelines for undertaking eDNA studies are greatly needed. Environmental DNA researchers from around the world have collaborated to produce this set of guidelines and considerations for implementing eDNA methods to detect aquatic macroorganisms. Critical considerations for study design include preventing contamination in the field and the laboratory, choosing appropriate sample analysis methods, validating assays, testing for sample inhibition and following minimum reporting guidelines. Critical considerations for inference include temporal and spatial processes, limits of correlation of eDNA with abundance, uncertainty of positive and negative results, and potential sources of allochthonous DNA. We present a synthesis of knowledge at this stage for application of this new and powerful detection method.

  14. Comparative Study of Drift Compensation Methods for Environmental Gas Sensors

    NASA Astrophysics Data System (ADS)

    Abidin, M. Z.; Asmat, Arnis; Hamidon, M. N.

    2018-02-01

    Most drift compensation attempts in environmental gas sensors emphasize only the “already-known” drift-causing parameters (i.e., ambient temperature, relative humidity) when compensating the sensor drift. Less consideration is given to other parameters (i.e., baseline responses) that may be affected indirectly by variation in the drift-causing parameter (in this context, ambient temperature). In this study, the “indirect” drift-causing parameter (drifted baseline responses) has been taken into consideration in compensating the sensor drift caused by ambient temperature variation, by means of a proposed drift compensation method (named the RT-method). The effectiveness of this method in compensating drift was analysed and compared with the common method that uses only the “already-known” drift-causing parameter (named the T-method), using the drift reduction percentage. The analysis shows that the RT-method outperformed the T-method in drift reduction percentage, with the ability to reduce drift by up to 64%, whereas the T-method was only able to reduce it by up to 45% for the TGS2600 sensor. This shows that including the drifted baseline responses in the drift compensation attempt results in improved drift compensation efficiency.
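
    A minimal numerical sketch of the contrast between the two approaches is given below: the T-method-like fit regresses the drift on ambient temperature only, while the RT-method-like variant also includes the drifted baseline response as a predictor. The synthetic data, the least-squares drift model, and the variable names are assumptions made for illustration; the paper's exact formulation is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
temp = 25 + 5 * np.sin(np.linspace(0, 20, n))            # ambient temperature [deg C]
baseline = 1.0 + 0.002 * np.arange(n) + 0.01 * temp      # slowly drifting baseline response
signal = 0.5 * rng.random(n)                             # "true" gas response (toy)
response = signal + 0.05 * temp + 0.8 * baseline         # measured, drift-contaminated output

def drift_compensate(y, predictors):
    """Fit a least-squares drift model on the chosen predictors and subtract it."""
    X = np.column_stack([np.ones_like(y)] + predictors)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coef                                  # residual = drift-corrected signal

corrected_T = drift_compensate(response, [temp])               # temperature only
corrected_RT = drift_compensate(response, [temp, baseline])    # temperature + baseline

for name, corr in (("T-method", corrected_T), ("RT-method", corrected_RT)):
    print(f"{name}: residual drift (std) = {np.std(corr - (signal - signal.mean())):.4f}")
```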

  15. Reducing Student Resistance to Active Learning: Strategies for Instructors

    ERIC Educational Resources Information Center

    Finelli, Cynthia J.; Nguyen, Kevin; DeMonbrun, Matthew; Borrego, Maura; Prince, Michael; Husman, Jennifer; Henderson, Charles; Shekhar, Prateek; Waters, Cynthia K.

    2018-01-01

    In spite of considerable evidence of the effectiveness of active learning and other contemporary teaching methods, barriers to adoption of those methods, such as possible student resistance, continue to exist. This study addresses student resistance by analyzing data from 1,051 students who completed our Student Response to Instructional Practices…

  16. Methods of alleviation of ionospheric scintillation effects on digital communications

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1974-01-01

    The degradation of the performance of digital communication systems because of ionospheric scintillation effects can be reduced either by diversity techniques or by coding. The effectiveness of traditional space-diversity, frequency-diversity and time-diversity techniques is reviewed and design considerations isolated. Time-diversity signaling is then treated as an extremely simple form of coding. More advanced coding methods, such as diffuse threshold decoding and burst-trapping decoding, which appear attractive in combatting scintillation effects are discussed and design considerations noted. Finally, adaptive coding techniques appropriate when the general state of the channel is known are discussed.

  17. Minimizing Dispersion in FDTD Methods with CFL Limit Extension

    NASA Astrophysics Data System (ADS)

    Sun, Chen

    The CFL extension in FDTD methods is receiving considerable attention as a way to reduce the computational effort and save simulation time. One of the major issues in CFL extension methods is the increased dispersion. We formulate a decomposition of the FDTD equations to study the behaviour of the dispersion. A compensation scheme to reduce the dispersion under CFL extension is constructed and proposed. We further study the CFL extension in an FDTD subgridding case, where we improve the accuracy by acting only on the FDTD equations of the fine grid. Numerical results confirm the efficiency of the proposed method for minimising dispersion.
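
    To make the notion of numerical dispersion concrete, the snippet below evaluates the standard dispersion relation of the 2-D Yee FDTD scheme and reports the phase-velocity error at two Courant numbers. It does not reproduce the CFL-extended scheme of the abstract, whose modified dispersion relation is not given here; the cell size and frequency are illustrative values.

```python
import numpy as np
from scipy.optimize import brentq

c = 299792458.0                        # speed of light [m/s]
dx = dy = 1e-3                         # 1 mm cells
freq = 30e9                            # 30 GHz -> 10 cells per wavelength
omega = 2 * np.pi * freq

def numerical_wavenumber(courant, theta):
    """Solve the 2-D Yee dispersion relation for |k| along propagation angle theta."""
    dt = courant * dx / (c * np.sqrt(2.0))               # fraction of the 2-D CFL limit
    lhs = (np.sin(omega * dt / 2.0) / (c * dt)) ** 2
    def residual(k):
        kx, ky = k * np.cos(theta), k * np.sin(theta)
        return (np.sin(kx * dx / 2) / dx) ** 2 + (np.sin(ky * dy / 2) / dy) ** 2 - lhs
    k0 = omega / c                                       # exact (continuum) wavenumber
    return brentq(residual, 0.5 * k0, 1.5 * k0)

for courant in (0.5, 0.99):
    vp = omega / numerical_wavenumber(courant, theta=0.0)
    print(f"Courant number {courant}: phase-velocity error {abs(vp - c) / c:.2e}")
```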

  18. Testing the effectiveness of certainty scales, cheap talk, and dissonance-minimization in reducing hypothetical bias in contingent valuation studies

    Treesearch

    Mark Morrison; Thomas C. Brown

    2009-01-01

    Stated preference methods such as contingent valuation and choice modeling are subject to various biases that may lead to differences between actual and hypothetical willingness to pay. Cheap talk, follow-up certainty scales, and dissonance minimization are three techniques for reducing this hypothetical bias. Cheap talk and certainty scales have received considerable...

  19. A Conditional Exposure Control Method for Multidimensional Adaptive Testing

    ERIC Educational Resources Information Center

    Finkelman, Matthew; Nering, Michael L.; Roussos, Louis A.

    2009-01-01

    In computerized adaptive testing (CAT), ensuring the security of test items is a crucial practical consideration. A common approach to reducing item theft is to define maximum item exposure rates, i.e., to limit the proportion of examinees to whom a given item can be administered. Numerous methods for controlling exposure rates have been proposed…

  20. Development of Boundary Condition Independent Reduced Order Thermal Models using Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Raghupathy, Arun; Ghia, Karman; Ghia, Urmila

    2008-11-01

    Compact Thermal Models (CTMs) to represent IC packages have traditionally been developed using the DELPHI-based (DEvelopment of Libraries of PHysical models for an Integrated design) methodology. The drawbacks of this method are presented, and an alternative method is proposed. A reduced-order model that provides the complete thermal information accurately with less computational resources can be effectively used in system level simulations. Proper Orthogonal Decomposition (POD), a statistical method, can be used to reduce the number of degrees of freedom or variables in the computations for such a problem. POD along with the Galerkin projection allows us to create reduced-order models that reproduce the characteristics of the system with a considerable reduction in computational resources while maintaining a high level of accuracy. The goal of this work is to show that this method can be applied to obtain a boundary condition independent reduced-order thermal model for complex components. The methodology is applied to the 1D transient heat equation.
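
    The core POD step is a singular value decomposition of a snapshot matrix of the full-order solution, keeping only the energy-dominant modes. The sketch below illustrates this on an invented 1-D conduction example; the analytic two-mode temperature field, the 99.99% energy cutoff, and the snapshot counts are assumptions for illustration, not the paper's test case.

```python
import numpy as np

# Snapshot matrix: each column is the temperature field at one time instant. Here a
# two-mode analytic solution of 1-D transient conduction stands in for the full model.
x = np.linspace(0.0, 1.0, 200)
t = np.linspace(0.01, 1.0, 80)
snapshots = np.array([np.exp(-np.pi ** 2 * ti) * np.sin(np.pi * x)
                      + 0.3 * np.exp(-9 * np.pi ** 2 * ti) * np.sin(3 * np.pi * x)
                      for ti in t]).T                       # shape (200, 80)

# POD: the left singular vectors are the energy-ranked spatial modes.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
r = int(np.searchsorted(energy, 0.9999)) + 1                # modes kept for 99.99% energy
basis = U[:, :r]

# Galerkin-style reduction: project a field onto the retained modes and reconstruct.
field = snapshots[:, 40]
reduced_coords = basis.T @ field                            # r numbers instead of 200
error = np.linalg.norm(field - basis @ reduced_coords) / np.linalg.norm(field)
print(f"kept {r} modes, relative reconstruction error {error:.2e}")
```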

  1. Characteristics linked to the reduction of stigma towards schizophrenia: a pre-and-post study of parents of adolescents attending an educational program

    PubMed Central

    2014-01-01

    Background The stigma of schizophrenia constitutes a major barrier to early detection and treatment of this illness. Anti-stigma education has been welcomed to reduce stigma among the general public. This study examined the factors associated with the effectiveness of a web-based educational program designed to reduce the stigma associated with schizophrenia. Methods Using Link’s Devaluation-Discrimination Scale to measure stigma, the effect of the program was measured by the difference in pre- and post-program tests. In the present study, we focused on program participants whose stigma towards schizophrenia had considerably improved (a reduction of three points or more between pre- and post-program tests) or considerably worsened (an increase of three points or more). The study participants were 1,058 parents of middle or high school students across Japan, including 508 whose stigma had significantly decreased after the program and 550 whose stigma had significantly increased. We used multiple logistic regression analysis to predict a considerable reduction in stigma (by three or more points) using independent variables measured before exposure to the program. In these models, we assessed the effects of demographic characteristics of the participants and four measures of knowledge and views on schizophrenia (basic knowledge, Link’s Devaluation-Discrimination Scale, ability to distinguish schizophrenia from other conditions, and social distance). Results Participants’ employment status, occupation, basic knowledge of schizophrenia, pre-program Link’s Devaluation-Discrimination Scale score, and social distance were significant factors associated with a considerable decrease in the stigma attached to schizophrenia following the educational program. Specifically, full-time and part-time employees were more likely to experience reduced stigma than parents who were self-employed, unemployed, or had other employment status. Considerable decreases in stigma were more likely among parents working in transportation and communication or as homemakers than among other occupational groups. In addition, parents with higher pre-program levels of stigma, lower basic knowledge, or lower social distance were more likely to have reduced levels of stigma. Conclusions Based on the regression analysis results presented here, several possible methods of reducing stigma were suggested, including increasing personal contact with people with schizophrenia and the improvement of law and insurance systems in primary and secondary industries. PMID:24642069

  2. Trying to prevent abortion.

    PubMed

    Bromham, D R; Oloto, E J

    1997-06-01

    It is known that, since antiquity, women confronted with an unwanted pregnancy have used abortion as a means of resolving their dilemma. Although undoubtedly widely used in all historical ages, abortion has come to be regarded as an event preferably avoided because of the impact on the women concerned as well as considerations for fetal life. Policies to reduce numbers and rates of abortion must acknowledge certain observations. Criminalization does not prevent abortion but increases maternal risks. A society's 'openness' in discussing sexual matters inversely correlates with abortion rates. Correlation between contraceptive use and abortion is also inverse but relates most closely to the efficacy of contraceptive methods used. 'Revolution' in the range of contraceptive methods used will have an equivalent impact on abortion rates. Secondary or emergency contraceptive methods have a considerable role to play in the reduction of abortion numbers. Good sex (and 'relationships') education programs may delay sexual debut, increase contraceptive usage and be associated with reduced abortion. Finally, interaction between socioeconomic factors and the choice between abortion and ongoing pregnancy are complex. Abortion is not necessarily chosen by those least able to support a child financially.

  3. Effects of elevated temperature on growth and reproduction of biofuels crops

    EPA Science Inventory

    Background/Questions/Methods Cellulosic biofuels crops have considerable potential to reduce our carbon footprint, and to be at least neutral in terms of carbon production. However, their widespread cultivation may result in unintended ecological and health effects. We report...

  4. PROCESS SIMULATION TOOLS FOR POLLUTION PREVENTION: NEW METHODS REDUCE THE MAGNITUDE OF WASTE STREAMS

    EPA Science Inventory

    Growing environmental concerns have spurred considerable interest in pollution prevention. In most instances, pollution prevention involves introducing radical changes to the design of processes so that waste generation is minimized. Process simulators can be effective tools in a...

  5. Development of Interior Permanent Magnet Motors with Concentrated Windings for Reducing Magnet Eddy Current Loss

    NASA Astrophysics Data System (ADS)

    Yamazaki, Katsumi; Kanou, Yuji; Fukushima, Yu; Ohki, Shunji; Nezu, Akira; Ikemi, Takeshi; Mizokami, Ryoichi

    In this paper, we present the development of interior permanent magnet motors with concentrated windings, which reduce the eddy current loss of the magnets. First, the mechanism of magnet eddy current loss generation is investigated with a simple linear magnetic circuit. Based on this consideration, an automatic optimization method using an adaptive finite element method is carried out to determine the stator and rotor shapes that decrease the eddy current loss of the magnets. The determined stator and rotor are manufactured in order to prove the effectiveness by measurement.

  6. Consideration of health inequalities in systematic reviews: a mapping review of guidance.

    PubMed

    Maden, Michelle

    2016-11-28

    Given that we know that interventions shown to be effective in improving the health of a population may actually widen the health inequalities gap while others reduce it, it is imperative that all systematic reviewers consider how the findings of their reviews may impact (reduce or increase) on the health inequality gap. This study reviewed existing guidance on incorporating considerations of health inequalities in systematic reviews in order to examine the extent to which they can help reviewers to incorporate such issues. A mapping review was undertaken to identify guidance documents that purported to inform reviewers on whether and how to incorporate considerations of health inequalities. Searches were undertaken in Medline, CINAHL and The Cochrane Library Methodology Register. Review guidance manuals prepared by international organisations engaged in undertaking systematic reviews, and their associated websites were scanned. Studies were included if they provided an overview or discussed the development and testing of guidance for dealing with the incorporation of considerations of health inequalities in evidence synthesis. Results are summarised in narrative and tabular forms. Twenty guidance documents published between 2009 and 2016 were included. Guidance has been produced to inform considerations of health inequalities at different stages of the systematic review process. The Campbell and Cochrane Equity Group have been instrumental in developing and promoting such guidance. Definitions of health inequalities and guidance differed across the included studies. All but one guidance document were transparent in their method of production. Formal methods of evaluation were reported for six guidance documents. Most of the guidance was operationalised in the form of examples taken from published systematic reviews. The number of guidance items to operationalise ranges from 3 up to 26 with a considerable overlap noted. Adhering to the guidance will require more work for the reviewers. It requires a deeper understanding of how reviewers can operationalise the guidance taking into consideration the barriers and facilitators involved. This has implications not only for understanding the usefulness and burden of the guidance but also for the uptake of guidance and its ultimate goal of improving health inequalities considerations in systematic reviews.

  7. Predicting protein contact map using evolutionary and physical constraints by integer programming.

    PubMed

    Wang, Zhiyong; Xu, Jinbo

    2013-07-01

    Protein contact map describes the pairwise spatial and functional relationship of residues in a protein and contains key information for protein 3D structure prediction. Although studied extensively, it remains challenging to predict contact map using only sequence information. Most existing methods predict the contact map matrix element-by-element, ignoring correlation among contacts and physical feasibility of the whole contact map. A couple of recent methods predict contact map by using mutual information, taking into consideration contact correlation and enforcing a sparsity restraint, but these methods demand a very large number of sequence homologs for the protein under consideration and the resultant contact map may still be physically infeasible. This article presents a novel method PhyCMAP for contact map prediction, integrating both evolutionary and physical restraints by machine learning and integer linear programming. The evolutionary restraints are much more informative than mutual information, and the physical restraints specify more concrete relationships among contacts than the sparsity restraint. As such, our method greatly reduces the solution space of the contact map matrix and, thus, significantly improves prediction accuracy. Experimental results confirm that PhyCMAP outperforms currently popular methods no matter how many sequence homologs are available for the protein under consideration. http://raptorx.uchicago.edu.

  8. Optimal selection of biochars for remediating metals contaminated mine soils

    EPA Science Inventory

    Approximately 500,000 abandoned mines across the U.S. pose a considerable, pervasive risk to human health and the environment due to possible exposure to the residuals of heavy metal extraction. Historically, a variety of chemical and biological methods have been used to reduce ...

  9. Effect of sampling rate and record length on the determination of stability and control derivatives

    NASA Technical Reports Server (NTRS)

    Brenner, M. J.; Iliff, K. W.; Whitman, R. K.

    1978-01-01

    Flight data from five aircraft were used to assess the effects of sampling rate and record length reductions on estimates of stability and control derivatives produced by a maximum likelihood estimation method. Derivatives could be extracted from flight data with the maximum likelihood estimation method even if there were considerable reductions in sampling rate and/or record length. Small amplitude pulse maneuvers showed greater degradation of the derivative estimates than large amplitude pulse maneuvers when these reductions were made. Reducing the sampling rate was found to be more desirable than reducing the record length as a method of lessening the total computation time required without greatly degrading the quality of the estimates.

  10. Hedged Monte-Carlo: low variance derivative pricing with objective probabilities

    NASA Astrophysics Data System (ADS)

    Potters, Marc; Bouchaud, Jean-Philippe; Sestovic, Dragan

    2001-01-01

    We propose a new ‘hedged’ Monte-Carlo (HMC) method to price financial derivatives, which allows one to determine the optimal hedge simultaneously. The inclusion of the optimal hedging strategy allows one to reduce the financial risk associated with option trading, and for the very same reason considerably reduces the variance of our HMC scheme as compared to previous methods. The explicit accounting of the hedging cost naturally converts the objective probability into the ‘risk-neutral’ one. This allows a consistent use of purely historical time series to price derivatives and obtain their residual risk. The method can be used to price a large class of exotic options, including those with path dependent and early exercise features.
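
    The backbone of the HMC scheme can be sketched as a backward-in-time least-squares regression in which the option price and the hedge are fitted jointly, so that the hedged one-step profit-and-loss variance is minimised. The sketch below is a simplified reading of that idea (zero interest rate, a European call, zero-drift lognormal paths, and a polynomial basis chosen for the example); the published method is more general.

```python
import numpy as np

rng = np.random.default_rng(1)
S0, K, sigma, T, n_steps, n_paths = 100.0, 100.0, 0.2, 1.0, 20, 20000
dt = T / n_steps

# Zero-drift lognormal paths standing in for "objective" (historical-like) dynamics.
dW = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
S = S0 * np.exp(np.cumsum(-0.5 * sigma ** 2 * dt + sigma * dW, axis=1))
S = np.hstack([np.full((n_paths, 1), S0), S])

def basis(s):
    """Polynomial basis in the spot price (a modelling choice made for this sketch)."""
    z = s / K
    return np.column_stack([np.ones_like(z), z, z ** 2, z ** 3])

C = np.maximum(S[:, -1] - K, 0.0)                # terminal payoff of a European call
for k in range(n_steps - 1, -1, -1):
    F = basis(S[:, k])
    dS = (S[:, k + 1] - S[:, k])[:, None]
    X = np.hstack([F, F * dS])                   # joint fit: price part and hedge part
    coef, *_ = np.linalg.lstsq(X, C, rcond=None)
    C = F @ coef[: F.shape[1]]                   # local price estimate carried backwards

print(f"HMC price estimate at S0: {C.mean():.3f}")  # hedged regression reduces MC noise
```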

  11. Algorithms and Object-Oriented Software for Distributed Physics-Based Modeling

    NASA Technical Reports Server (NTRS)

    Kenton, Marc A.

    2001-01-01

    The project seeks to develop methods to more efficiently simulate aerospace vehicles. The goals are to reduce model development time, increase accuracy (e.g.,by allowing the integration of multidisciplinary models), facilitate collaboration by geographically- distributed groups of engineers, support uncertainty analysis and optimization, reduce hardware costs, and increase execution speeds. These problems are the subject of considerable contemporary research (e.g., Biedron et al. 1999; Heath and Dick, 2000).

  12. The role of anthropometry in designing for sustainability.

    PubMed

    Nadadur, Gopal; Parkinson, Matthew B

    2013-01-01

    An understanding of human factors and ergonomics facilitates the design of artefacts, tasks and environments that fulfil their users' physical and cognitive requirements. Research in these fields furthers the goal of efficiently accommodating the desired percentage of user populations through enhanced awareness and modelling of human variability. Design for sustainability (DfS) allows for these concepts to be leveraged in the broader context of designing to minimise negative impacts on the environment. This paper focuses on anthropometry and proposes three ways in which its consideration is relevant to DfS: reducing raw material consumption, increasing usage lifetimes and ethical human resource considerations. This is demonstrated through the application of anthropometry synthesis, virtual fitting, and sizing and adjustability allocation methods in the design of an industrial workstation seat for use in five distinct global populations. This work highlights the importance of and opportunities for using ergonomic design principles in DfS efforts. This research demonstrates the relevance of some anthropometry-based ergonomics concepts to the field of design for sustainability. A global design case study leverages human variability considerations in furthering three sustainable design goals: reducing raw material consumption, increasing usage lifetimes and incorporating ethical human resource considerations in design.

  13. A preprocessing strategy for helioseismic inversions

    NASA Astrophysics Data System (ADS)

    Christensen-Dalsgaard, J.; Thompson, M. J.

    1993-05-01

    Helioseismic inversion in general involves considerable computational expense, due to the large number of modes that is typically considered. This is true in particular of the widely used optimally localized averages (OLA) inversion methods, which require the inversion of one or more matrices whose order is the number of modes in the set. However, the number of practically independent pieces of information that a large helioseismic mode set contains is very much less than the number of modes, suggesting that the set might first be reduced before the expensive inversion is performed. We demonstrate with a model problem that by first performing a singular value decomposition the original problem may be transformed into a much smaller one, considerably reducing the cost of the OLA inversion with no significant loss of information.
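
    The preprocessing idea generalises to any linear inverse problem with far more data (modes) than model parameters: project the data onto the leading singular directions of the kernel matrix and invert the small transformed problem. The toy example below assumes a random kernel and a smooth model purely for illustration; it is not the OLA inversion itself, only the SVD reduction step that precedes it.

```python
import numpy as np

rng = np.random.default_rng(2)
n_modes, n_model = 3000, 60                  # many observed modes, far fewer model parameters
A = rng.standard_normal((n_modes, n_model))  # mode kernels (toy stand-in)
m_true = np.sin(np.linspace(0, 3 * np.pi, n_model))
d = A @ m_true + 0.01 * rng.standard_normal(n_modes)

# Preprocessing: SVD of the kernel matrix; the information content lives in at most
# min(n_modes, n_model) singular directions, so project the data onto them.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
p = int(np.sum(s > 1e-8 * s[0]))             # effective rank of the mode set
d_small = U[:, :p].T @ d                     # reduced data vector (length p << n_modes)
A_small = np.diag(s[:p]) @ Vt[:p]            # reduced kernel (p x n_model)

# Any subsequent (e.g. OLA-style) inversion now operates on a p-row problem.
m_est = np.linalg.lstsq(A_small, d_small, rcond=None)[0]
print(f"reduced from {n_modes} to {p} rows; model error {np.linalg.norm(m_est - m_true):.3f}")
```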

  14. MOLECULAR EVALUATION OF CHANGES IN PLANKTONIC BACTERIAL POPULATIONS RESULTING FROM EQUINE FECAL CONTAMINATION IN A SUB-WATERSHED

    EPA Science Inventory

    Considerable emphasis has been placed on developing watershed-based strategies with the potential to reduce non-point-source fecal contamination. Molecular methods using 16S ribosomal DNA (rDNA) were applied to try to determine sources of fecal contamination. Objectiv...

  15. A Method of Determining Where to Target Surveillance Efforts in Heterogeneous Epidemiological Systems

    USDA-ARS?s Scientific Manuscript database

    The spread of pathogens into new environments poses a considerable threat to health, ecosystems, and agricultural productivity worldwide. Early detection through effective surveillance is a key strategy to reduce the risk of their establishment. Whilst it is well established that statistical and eco...

  16. Critical Evaluation of Animal Alternative Tests for the Identification of Endocrine Active Substances, oral presentation

    EPA Science Inventory

    In the past 20 years, considerable progress in animal alternatives accompanied by advances in the toxicological sciences and new emphases on aquatic vertebrates has appeared. A significant amount of current research is targeted to evaluate alternative test methods that may reduce...

  17. Leukocyte-reduced blood components: patient benefits and practical applications.

    PubMed

    Higgins, V L

    1996-05-01

    To review the various types of filters used for red blood cell and platelet transfusions and to explain the trend in the use of leukocyte removal filters, practical information about their use, considerations in the selection of a filtration method, and cost-effectiveness issues. Published articles, books, and the author's experience. Leukocyte removal filters are used to reduce complications associated with transfused white blood cells that are contained in units of red blood cells and platelets. These complications include nonhemolytic febrile transfusion reactions (NHFTRs), alloimmunization and refractoriness to platelet transfusion, transfusion-transmitted cytomegalovirus (CMV), and immunomodulation. Leukocyte removal filters may be used at the bedside, in a hospital blood bank, or in a blood collection center. Factors that affect the flow rate of these filters include the variations in the blood component, the equipment used, and filter priming. Studies on the cost-effectiveness of using leukocyte-reduced blood components demonstrate savings based on the reduction of NHFTRs, reduction in the number of blood components used, and the use of filtered blood components as the equivalent of CMV seronegative-screened products. The use of leukocyte-reduced blood components significantly diminishes or prevents many of the adverse transfusion reactions associated with donor white blood cells. Leukocyte removal filters are cost-effective, and filters should be selected based on their ability to consistently achieve low leukocyte residual levels as well as their ease of use. Physicians may order leukocyte-reduced blood components for specific patients, or the components may be used because of an established institutional transfusion policy. Nurses often participate in deciding on a filtration method, primarily based on ease of use. Understanding the considerations in selecting a filtration method will help nurses make appropriate decisions to ensure quality patient care.

  18. Diameter Distributions and Basal Area of Pines and Hardwoods 12 Years Following Various Methods and Intensities of Site Preparation in the Georgia Piedmont

    Treesearch

    Timothy B. Harrington; M. Boyd Edwards

    1997-01-01

    Twelve years after various methods and intensities of site preparation in the Georgia Piedmont, diameter distributions and basal area (BA) of pines and hardwoods varied considerably among treatments. Site preparation reduced hardwood basal area to 36 percent of that observed in clearcut-only plots. As a result, planted-pine BA in the presence of site preparation was 2...

  19. ON THE CALIBRATION OF DK-02 AND KID DOSIMETERS (in Estonian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehvaert, H.

    1963-01-01

    For the periodic calibration of the DK-02 and WD dosimeters, the rotating stand method, which is more advantageous than the usual method, is recommended. The calibration can be accomplished in a strong gamma field, considerably reducing the time necessary for calibration. Using a point source, the dose becomes a simple function of time and geometrical parameters. The experimental values are in good agreement with theoretical values. (tr-auth)

  20. A Unified Approach to Measurement Error and Missing Data: Overview and Applications

    ERIC Educational Resources Information Center

    Blackwell, Matthew; Honaker, James; King, Gary

    2017-01-01

    Although social scientists devote considerable effort to mitigating measurement error during data collection, they often ignore the issue during data analysis. And although many statistical methods have been proposed for reducing measurement error-induced biases, few have been widely used because of implausible assumptions, high levels of model…

  1. Evaluation of two outlier-detection-based methods for detecting tissue-selective genes from microarray data.

    PubMed

    Kadota, Koji; Konishi, Tomokazu; Shimizu, Kentaro

    2007-05-01

    Large-scale expression profiling using DNA microarrays enables identification of tissue-selective genes for which expression is considerably higher and/or lower in some tissues than in others. Among numerous possible methods, only two outlier-detection-based methods (an AIC-based method and Sprent's non-parametric method) can treat equally various types of selective patterns, but they produce substantially different results. We investigated the performance of these two methods for different parameter settings and for a reduced number of samples. We focused on their ability to detect selective expression patterns robustly. We applied them to public microarray data collected from 36 normal human tissue samples and analyzed the effects of both changing the parameter settings and reducing the number of samples. The AIC-based method was more robust in both cases. The findings confirm that the use of the AIC-based method in the recently proposed ROKU method for detecting tissue-selective expression patterns is correct and that Sprent's method is not suitable for ROKU.

  2. Method of euthanasia affects amygdala plasticity in horizontal brain slices from mice.

    PubMed

    Kulisch, C; Eckers, N; Albrecht, D

    2011-10-15

    An important consideration in any terminal experiment is the method used for euthanizing animals. Although the prime consideration is that the method is humane, some methods can have a dramatic impact on experimental outcomes. The standard inhalant anesthetic for experiments in brain slices is isoflurane, which replaced the flammable ethers used in the pioneer days of surgery. To our knowledge, there are no data available evaluating the effects of the method of euthanasia on plasticity changes in brain slices. Here, we compare the magnitude of long-term potentiation (LTP) and long-term depression (LTD) in the lateral nucleus of the amygdala (LA) after euthanasia following either ether or isoflurane anesthesia, as well as in mice decapitated without anesthesia. We found no differences in input-output curves using different methods of euthanasia. The LTP magnitude did not differ between ether and normal isoflurane anesthesia. After deep isoflurane anesthesia LTP induced by high frequency stimulation of cortical or intranuclear afferents was significantly reduced compared to ether anesthesia. In contrast to ether anesthesia and decapitation without anesthesia, the low frequency stimulation of cortical afferents induced a reliable LA-LTD after deep isoflurane anesthesia. Low frequency stimulation of intranuclear afferents only caused LTD after pretreatment with ether anesthesia. The results demonstrate that the method of euthanasia can influence brain plasticity for hours at least in the interface chamber. Therefore, the method of euthanasia is an important consideration when brain plasticity will be evaluated. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. Printed Arabic optical character segmentation

    NASA Astrophysics Data System (ADS)

    Mohammad, Khader; Ayyesh, Muna; Qaroush, Aziz; Tumar, Iyad

    2015-03-01

    Considerable progress in recognition techniques for many non-Arabic scripts has been achieved. In contrast, comparatively little effort has been devoted to research on Arabic characters. In any Optical Character Recognition (OCR) system, segmentation is usually the essential stage, to which an extensive portion of processing is devoted and to which a considerable share of recognition errors is attributed. In this research, a novel segmentation approach for machine-printed Arabic text with diacritics is proposed. The proposed method reduces computation and errors, gives a clear description of the sub-word, and has advantages over the skeleton approach, in which character information can be lost. Initial evaluation and testing of the proposed method were carried out in MATLAB, showing promising results of 98.7%.

  4. Chemoenzymatic method for glycomics: isolation, identification, and quantitation

    PubMed Central

    Yang, Shuang; Rubin, Abigail; Eshghi, Shadi Toghi; Zhang, Hui

    2015-01-01

    Over the past decade, considerable progress has been made with respect to the analytical methods for analysis of glycans from biological sources. Regardless of the specific methods that are used, glycan analysis includes isolation, identification, and quantitation. Derivatization is indispensable for improving their identification. Derivatization of glycans can be performed by permethylation or carbodiimide coupling/esterification. By introducing a fluorophore or chromophore at their reducing end, glycans can be separated by electrophoresis or chromatography. The fluorogenically labeled glycans can be quantitated using fluorescent detection. The recently developed solid-phase approaches, such as glycoprotein immobilization for glycan extraction and on-tissue glycan mass spectrometry imaging, demonstrate advantages over methods performed in solution. Derivatization of sialic acids is favorably implemented on the solid support using carbodiimide coupling, and the released glycans can be further modified at the reducing end or permethylated for quantitative analysis. In this review, methods for glycan isolation, identification, and quantitation are discussed. PMID:26390280

  5. Experimental study of the possibility of reducing the resistance and unevenness of output field of velocities in flat diffuser channels with large opening angles

    NASA Astrophysics Data System (ADS)

    Dmitriev, S. S.; Vasil'ev, K. E.; Mokhamed, S. M. S. O.; Gusev, A. A.; Barbashin, A. V.

    2017-11-01

    In modern combined cycle gas turbines (CCGT), when designing the reducers from the output diffuser of a gas turbine to a boiler-utilizer, wide-angle diffusers are used, in which flow separation and transition to a jet stream regime occur practically from the inlet. In such channels the energy losses rise sharply and the velocity field at the outlet is characterized by considerable unevenness, which worsens the heat transfer process in the first tube bundles of the boiler-utilizer. This paper presents the results of an experimental study of a method for reducing the energy losses and aligning the velocity field at the outlet of a flat asymmetrical diffuser channel with one deflecting wall and an opening angle of 40°, by placing a flat plate inside the channel parallel to the deflecting wall. It is shown that, with this placement of the plate in the channel, the energy losses can be reduced by 20%, the outlet velocity field can be considerably aligned, and the dynamic loads on the walls in the outlet cross-section can be decreased. The studied method of reducing the resistance and aligning the velocity fields in flat diffuser channels was used to optimize the reducer from the output diffuser of the gas turbine to the boiler-utilizer of the PGU-450T CCGT at Kaliningrad Thermal Power Plant-2. The results obtained indicate that the configuration of the reducer installed in the PGU-450T at Kaliningrad Thermal Power Plant-2 is not optimal. It also follows from the data that development of the reducer should be carried out using test results for a channel consisting of the reducer model with a model of the boiler-utilizer installed behind it. Applying the investigated method of aligning the outlet velocity field and reducing the resistance in wide-angle diffusers made it possible, when using the known but aerodynamically unfavorable model of the diffusion reducer for the PGU-450T, to reduce the total loss coefficient by almost 20% compared with the model of the actual PGU-450T reducer.

  6. The Effectiveness of Art Therapy in Reducing Internalizing and Externalizing Problems of Female Adolescents.

    PubMed

    Bazargan, Yasaman; Pakdaman, Shahla

    2016-01-01

    Internalizing and externalizing problems in childhood and adolescence have always been significant. Because there are special considerations in establishing communication with this age group, therapeutic methods for these problems must take those considerations into account. As establishing a therapeutic relationship is an important component of effective counseling, it seems that art therapy may help alleviate these problems. The purpose of this study is to determine the effectiveness of art therapy in reducing internalizing and externalizing problems of adolescent girls (14 - 18 years old). This is a semi-experimental study carried out in the form of a pre-test/post-test design with a control group. The population of this study includes female students of Gole Laleh School of Art in district 3 of Tehran, Iran, out of which 30 students with internalizing problems and 30 with externalizing problems were selected through targeted sampling. Students were randomly assigned to control and experimental groups. Experimental groups participated in 6 painting sessions designed based on art therapy theories and previous studies. The instrument used for diagnosis of the problems in the pretest and posttest was the Achenbach self-assessment form. Data were analyzed using a mixed analysis of variance (ANOVA). Our results showed that art therapy significantly reduced internalizing problems (F = 17.61, P < 0.001); however, its effect in reducing externalizing problems was not significant (F = 3.93, P = 0.06). Art therapy as a practical therapeutic method can be used to improve internalizing problems. To reduce externalizing problems, more sessions may be needed. Thus, future studies are required to confirm these findings.

  7. Systematic generation of multibody equations of motion suitable for recursive and parallel manipulation

    NASA Technical Reports Server (NTRS)

    Nikravesh, Parviz E.; Gim, Gwanghum; Arabyan, Ara; Rein, Udo

    1989-01-01

    The formulation of a method known as the joint coordinate method for automatic generation of the equations of motion for multibody systems is summarized. For systems containing open or closed kinematic loops, the equations of motion can be reduced systematically to a minimum number of second order differential equations. The application of recursive and nonrecursive algorithms to this formulation, computational considerations and the feasibility of implementing this formulation on multiprocessor computers are discussed.

  8. Diagram reduction in problem of critical dynamics of ferromagnets: 4-loop approximation

    NASA Astrophysics Data System (ADS)

    Adzhemyan, L. Ts; Ivanova, E. V.; Kompaniets, M. V.; Vorobyeva, S. Ye

    2018-04-01

    Within the framework of the renormalization group approach to the models of critical dynamics, we propose a method for a considerable reduction of the number of integrals needed to calculate the critical exponents. With this method we perform a calculation of the critical exponent z of model A at 4-loop level, where our method allows one to reduce the number of integrals from 66 to 17. The way of constructing the integrand in a Feynman representation of such diagrams is discussed. Integrals were estimated numerically with a sector decomposition technique.

  9. The Role of Socioeconomic Status and Health Care Access in Breast Cancer Screening Compliance Among Hispanics.

    PubMed

    Jadav, Smruti; Rajan, Suja S; Abughosh, Susan; Sansgiry, Sujit S

    2015-01-01

    Considerable disparities in breast cancer screening exist between Hispanic and non-Hispanic white (NHW) women. Identifying and quantifying the factors contributing to these racial-ethnic disparities can help shape interventions and policies aimed at reducing these disparities. This study, for the first time, identified and quantified individual-level sociodemographic and health-related factors that contribute to racial-ethnic disparities in breast cancer screening using the nonlinear Blinder-Oaxaca decomposition method. Analysis of the retrospective pooled cross-sectional Medical Expenditure Panel Survey data from 2000 to 2010 was conducted. Women aged 40 years and older were included in the study. Logistic regressions were used to estimate racial-ethnic disparities in breast cancer screening. Nonlinear Blinder-Oaxaca decomposition method was used to identify and quantify the contribution of each individual-level factor toward racial-ethnic disparities. Based on the unadjusted analyses, Hispanic women had lower odds of receiving mammogram screening (MS) (odds ratio [OR]: 0.74; 95% confidence interval [CI]: 0.69-0.80) and breast cancer screening (OR: 0.75; 95% CI: 0.70-0.81) as compared with NHW women. However, the relationship reversed in adjusted analyses, such that Hispanic women had higher odds of receiving MS (OR: 1.27; 95% CI: 1.16-1.40) and breast cancer screening (OR: 1.28; 95% CI: 1.17-1.40) as compared with NHW women. The Blinder-Oaxaca decomposition estimated that improving insurance status, access to care, education, and income will considerably increase screening rates among Hispanic women. The study projects that improving health care access and health education will considerably increase breast cancer screening compliance among Hispanic women. Policies like the Affordable Care Act, and patient navigation and health education interventions, might considerably reduce screening disparities in the Hispanic population.

  10. Objectives and considerations for wildland fuel treatment in forested ecosystems of the interior western United States

    Treesearch

    Elizabeth D. Reinhardt; Robert E. Keane; David E. Calkin; Jack D. Cohen

    2008-01-01

    Many natural resource agencies and organizations recognize the importance of fuel treatments as tools for reducing fire hazards and restoring ecosystems. However, there continues to be confusion and misconception about fuel treatments and their implementation and effects in fire-prone landscapes across the United States. This paper (1) summarizes objectives, methods,...

  11. The constraint method: A new finite element technique. [applied to static and dynamic loads on plates

    NASA Technical Reports Server (NTRS)

    Tsai, C.; Szabo, B. A.

    1973-01-01

    An approach to the finite element method which utilizes families of conforming finite elements based on complete polynomials is presented. Finite element approximations based on this method converge with respect to progressively reduced element sizes as well as with respect to progressively increasing orders of approximation. Numerical results of static and dynamic applications of plates are presented to demonstrate the efficiency of the method. Comparisons are made with plate elements in NASTRAN and the high-precision plate element developed by Cowper and his co-workers. Some consideration is given to implementation of the constraint method into general purpose computer programs such as NASTRAN.

  12. Measuring geographical accessibility to rural and remote health care services: Challenges and considerations.

    PubMed

    Shah, Tayyab Ikram; Milosavljevic, Stephan; Bath, Brenna

    2017-06-01

    This research is focused on methodological challenges and considerations associated with the estimation of the geographical aspects of access to healthcare, with a focus on rural and remote areas. We assume that GIS-based accessibility measures for rural healthcare services will vary across geographic units of analysis and estimation techniques, which could influence the interpretation of spatial access to rural healthcare services. Estimations of geographical accessibility depend on variations of the following three parameters: 1) quality of input data; 2) accessibility method; and 3) geographical area. This research investigated the spatial distributions of physiotherapists (PTs) in comparison to family physicians (FPs) across Saskatchewan, Canada. The three-step floating catchment area (3SFCA) method was applied to calculate the accessibility scores for both PT and FP services at two different geographical units. Accessibility scores were also compared with simple healthcare provider-to-population ratios. The results vary considerably depending on the accessibility methods used and the choice of geographical area unit for measuring geographical accessibility for both FP and PT services. These findings raise intriguing questions regarding the nature and extent of technical issues and methodological considerations that can affect GIS-based measures in health services research and planning. This study demonstrates how the selection of geographical areal units and different methods for measuring geographical accessibility could affect the distribution of healthcare resources in rural areas. These methodological issues have implications for determining where there is reduced access that will ultimately impact health human resource priorities and policies. Copyright © 2017 Elsevier Ltd. All rights reserved.
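
    For readers unfamiliar with floating catchment area methods, the sketch below implements one common reading of the three-step idea: populations split their demand among reachable providers, providers receive a supply-to-weighted-demand ratio, and those ratios are summed back to each population with distance-decay weights. The coordinates, catchment radius, Gaussian decay weight, and supply figures are invented for the example and are not the study's data or exact formulation.

```python
import numpy as np

# Toy inputs: population centres, provider locations, and provider capacity.
pop_xy = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 8.0], [9.0, 9.0]])
pop_size = np.array([1000.0, 1500.0, 800.0, 1200.0])
prov_xy = np.array([[1.0, 1.0], [8.0, 8.0]])
prov_supply = np.array([2.0, 3.0])                 # e.g. physiotherapist FTEs

d0 = 10.0                                          # catchment radius (same units as coordinates)
dist = np.linalg.norm(pop_xy[:, None, :] - prov_xy[None, :, :], axis=2)
W = np.where(dist <= d0, np.exp(-((dist / d0) ** 2)), 0.0)   # distance-decay weight

# Step 1: selection weights -- each population splits its demand among reachable providers.
row_sum = W.sum(axis=1, keepdims=True)
G = np.divide(W, row_sum, out=np.zeros_like(W), where=row_sum > 0)

# Step 2: provider-level supply-to-weighted-demand ratios.
demand = (G * W * pop_size[:, None]).sum(axis=0)
R = np.divide(prov_supply, demand, out=np.zeros_like(demand), where=demand > 0)

# Step 3: accessibility score for each population centre.
access = (G * W * R[None, :]).sum(axis=1)
print(np.round(access, 5))
```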

  13. Aerodynamic characteristics of the Fiat UNO car

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costelli, A.F.

    1984-01-01

    The purpose of this article is to describe the work conducted in the aerodynamic field throughout the 4-year development and engineering time span required by the project of the UNO car. A description is given of all the parametric studies carried out. Through these studies two types of cars at present in production were defined and the characteristics of a possible future sports version laid down. A movable device, to be fitted in the back window, was also set up and patented. When actuated it reduces soiling of the back window. A description is also provided of the measurements made in the car flow field and some considerations are outlined about the method applied. This method is still in the development phase but it already permits some considerations and in-depth investigations to be made on the vehicle wake.

  14. Current problems in applied mathematics and mathematical physics

    NASA Astrophysics Data System (ADS)

    Samarskii, A. A.

    Papers are presented on such topics as mathematical models in immunology, mathematical problems of medical computer tomography, classical orthogonal polynomials depending on a discrete variable, and boundary layer methods for singular perturbation problems in partial derivatives. Consideration is also given to the computer simulation of supernova explosion, nonstationary internal waves in a stratified fluid, the description of turbulent flows by unsteady solutions of the Navier-Stokes equations, and the reduced Galerkin method for external diffraction problems using the spline approximation of fields.

  15. On the track for an efficient detection of Escherichia coli in water: A review on PCR-based methods.

    PubMed

    Mendes Silva, Diana; Domingues, Lucília

    2015-03-01

    Ensuring water safety is an ongoing challenge to public health providers. Assessing the presence of fecal contamination indicators in water is essential to protect public health from diseases caused by waterborne pathogens. For this purpose, the bacterium Escherichia coli has been used as the most reliable indicator of fecal contamination in water. The methods currently in use for monitoring the microbiological safety of water are based on culturing the microorganisms. However, these methods are not the desirable solution to prevent outbreaks, as they provide results with a considerable delay and lack specificity and sensitivity. Moreover, viable but non-culturable microorganisms, which may be present as a result of environmental stress or water treatment processes, are not detected by culture-based methods and thus may result in false-negative assessments of E. coli in water samples. These limitations may place public health at significant risk, leading to substantial monetary losses in health care and, additionally, in costs related to reduced productivity in the area affected by the outbreak and in costs borne by the water quality control departments involved. Molecular methods, particularly polymerase chain reaction-based methods, have been studied as an alternative technology to overcome the current limitations, as they offer the possibility to reduce the assay time, to improve detection sensitivity and specificity, and to identify multiple targets and pathogens, including new or emerging strains. The variety of techniques and applications available for PCR-based methods has increased considerably, and the costs involved have been substantially reduced, which together have contributed to the potential standardization of these techniques. However, they still require further refinement in order to be standardized and applied to the variety of environmental waters and their specific characteristics. The PCR-based methods under development for monitoring the presence of E. coli in water are discussed here. Special emphasis is given to methodologies that avoid pre-enrichment during the water sample preparation process so that the assay time is reduced and the required legislated sensitivity is achieved. The advantages and limitations of these methods are also reviewed, contributing to a more comprehensive overview and more informed research on identifying E. coli in water. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Improved Conjugate Gradient Bundle Adjustment of Dunhuang Wall Painting Images

    NASA Astrophysics Data System (ADS)

    Hu, K.; Huang, X.; You, H.

    2017-09-01

    Bundle adjustment with additional parameters is identified as a critical step for precise orthoimage generation and 3D reconstruction of Dunhuang wall paintings. Due to the introduction of self-calibration parameters and quasi-planar constraints, the coefficient matrix of the reduced normal equation has a banded-bordered structure, making the solving process of bundle adjustment complex. In this paper, a Conjugate Gradient Bundle Adjustment (CGBA) method is derived by calculus of variations. A preconditioning method based on improved incomplete Cholesky factorization is adopted to reduce the condition number of the coefficient matrix and to accelerate the convergence of CGBA. Both theoretical analysis and an experimental comparison with the conventional method indicate that the proposed method can effectively overcome the ill-conditioning of the normal equation and considerably improve the computational efficiency of bundle adjustment with additional parameters, while maintaining the actual accuracy.
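
    As a rough illustration of the preconditioned conjugate-gradient idea (not the authors' implementation), the sketch below solves a synthetic symmetric positive-definite normal equation with SciPy's CG solver, using an incomplete LU factorization as a stand-in for the improved incomplete Cholesky preconditioner. The matrix, its size, and the drop tolerance are arbitrary assumptions.

    ```python
    # Preconditioned conjugate gradients on a synthetic SPD "normal equation".
    # spilu (incomplete LU) stands in for an incomplete Cholesky preconditioner.
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 200
    A = sp.random(n, n, density=0.02, random_state=0, format="csc")
    N = (A @ A.T + sp.identity(n) * n).tocsc()      # SPD and well-posed by construction
    b = np.random.default_rng(0).standard_normal(n)

    ilu = spla.spilu(N, drop_tol=1e-4)              # incomplete factorization
    M = spla.LinearOperator(N.shape, matvec=ilu.solve)

    x, info = spla.cg(N, b, M=M, maxiter=500)
    print("converged" if info == 0 else f"cg info = {info}",
          "| residual:", np.linalg.norm(N @ x - b))
    ```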

  17. Removal of bacteria from boar ejaculates by Single Layer Centrifugation can reduce the use of antibiotics in semen extenders.

    PubMed

    Morrell, J M; Wallgren, M

    2011-01-01

    There is considerable interest world-wide in reducing the use of antibiotics to stem the development of antibiotic-resistant strains of bacteria. An alternative to the routine addition of antibiotics to semen extenders in livestock breeding would be to separate the spermatozoa from bacterial contaminants in the semen immediately after collection. The present study was designed to determine whether such separation was possible by Single Layer Centrifugation (SLC) using the colloid Androcoll™-P. The results showed that complete removal (6 out of 10 samples), or considerable reduction of bacterial contaminants (4 out of 10 samples) was possible with this method. The type of bacteria and/or the length of time between collection and SLC-processing affected the removal of bacteria, with motile flagellated bacteria being more likely to be present after SLC than non-flagellated bacteria. Although further studies are necessary, these preliminary results suggest that the use of SLC when processing boar semen for AI doses might enable antibiotic usage in semen extenders to be reduced. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. Implementing AORN recommended practices for MIS: Part II.

    PubMed

    Morton, Paula J

    2012-10-01

    This article focuses on the equipment and workplace safety aspects of the revised AORN "Recommended practices for minimally invasive surgery." A multidisciplinary team that includes the perioperative nurse should be established to discuss aspects of the development and design of new construction or renovation (eg, room access, ergonomics, low-lighting, OR integration, hybrid OR considerations, design development). Equipment safety considerations during minimally invasive surgical procedures include using active electrode monitoring; verifying the properties of distention media; using smoke evacuation systems; reducing equipment, electrical, thermal, and fire hazards; performing routine safety checks on insufflation accessories; and minimizing the risk of ergonomic injuries to staff members. Additional considerations include using video recording devices, nonmagnetic equipment during magnetic resonance imaging, and fluid containment methods for fluid management. Copyright © 2012 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  19. Evaluation of Two Outlier-Detection-Based Methods for Detecting Tissue-Selective Genes from Microarray Data

    PubMed Central

    Kadota, Koji; Konishi, Tomokazu; Shimizu, Kentaro

    2007-01-01

    Large-scale expression profiling using DNA microarrays enables identification of tissue-selective genes for which expression is considerably higher and/or lower in some tissues than in others. Among numerous possible methods, only two outlier-detection-based methods (an AIC-based method and Sprent’s non-parametric method) can treat equally various types of selective patterns, but they produce substantially different results. We investigated the performance of these two methods for different parameter settings and for a reduced number of samples. We focused on their ability to detect selective expression patterns robustly. We applied them to public microarray data collected from 36 normal human tissue samples and analyzed the effects of both changing the parameter settings and reducing the number of samples. The AIC-based method was more robust in both cases. The findings confirm that the use of the AIC-based method in the recently proposed ROKU method for detecting tissue-selective expression patterns is correct and that Sprent’s method is not suitable for ROKU. PMID:19936074

  20. The Worth of a Sparrow: A Decision Case in University Research and Public Relations.

    ERIC Educational Resources Information Center

    Crookston, R. Kent; And Others

    1993-01-01

    The University of Minnesota trapped and killed birds to reduce bird damage to research grain plots. When the Animal Rights Coalition demanded the practice be stopped, the situation became a public controversy. Presents an abridged form of this case as a focus for consideration of research methods, interest group agenda, and the universities' role…

  1. Characteristics linked to the reduction of stigma towards schizophrenia: a pre-and-post study of parents of adolescents attending an educational program.

    PubMed

    Ling, Yiwei; Watanabe, Mayumi; Yoshii, Hatsumi; Akazawa, Kouhei

    2014-03-18

    The stigma of schizophrenia constitutes a major barrier to early detection and treatment of this illness. Anti-stigma education has been welcomed to reduce stigma among the general public. This study examined the factors associated with the effectiveness of a web-based educational program designed to reduce the stigma associated with schizophrenia. Using Link's Devaluation-Discrimination Scale to measure stigma, the effect of the program was measured by the difference in pre- and post-program tests. In the present study, we focused on program participants whose stigma towards schizophrenia had considerably improved (a reduction of three points or more between pre- and post-program tests) or considerably worsened (an increase of three points or more). The study participants were 1,058 parents of middle or high school students across Japan, including 508 whose stigma had significantly decreased after the program and 550 whose stigma had significantly increased. We used multiple logistic regression analysis to predict a considerable reduction in stigma (by three or more points) using independent variables measured before exposure to the program. In these models, we assessed the effects of demographic characteristics of the participants and four measures of knowledge and views on schizophrenia (basic knowledge, Link's Devaluation-Discrimination Scale, ability to distinguish schizophrenia from other conditions, and social distance). Participants' employment status, occupation, basic knowledge of schizophrenia, pre-program Link's Devaluation-Discrimination Scale score, and social distance were significant factors associated with a considerable decrease in the stigma attached to schizophrenia following the educational program. Specifically, full-time and part-time employees were more likely to experience reduced stigma than parents who were self-employed, unemployed, or had other employment status. Considerable decreases in stigma were more likely among parents working in transportation and communication or as homemakers than among other occupational groups. In addition, parents with higher pre-program levels of stigma, lower basic knowledge, or lower social distance were more likely to have reduced levels of stigma. Based on the regression analysis results presented here, several possible methods of reducing stigma were suggested, including increasing personal contact with people with schizophrenia and the improvement of law and insurance systems in primary and secondary industries.
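
    For readers unfamiliar with the modeling style, the following is a minimal sketch of a multiple logistic regression of a binary "considerable reduction in stigma" outcome on pre-program covariates, reported as odds ratios. The variable names, data, and coefficients are hypothetical and are not drawn from the study.

    ```python
    # Toy multiple logistic regression predicting a binary "reduced by 3+ points" outcome.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 300
    df = pd.DataFrame({
        "pre_stigma": rng.normal(30, 5, n),        # hypothetical pre-program scale score
        "basic_knowledge": rng.normal(6, 2, n),
        "social_distance": rng.normal(12, 3, n),
        "full_time": rng.integers(0, 2, n),        # hypothetical employment dummy
    })
    # Simulate the outcome from an assumed (made-up) relationship
    logit = -6 + 0.15 * df["pre_stigma"] - 0.2 * df["basic_knowledge"]
    df["reduced_3plus"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

    X = sm.add_constant(df[["pre_stigma", "basic_knowledge",
                            "social_distance", "full_time"]])
    model = sm.Logit(df["reduced_3plus"], X).fit(disp=0)
    print(np.exp(model.params))                    # odds ratios per covariate
    ```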

  2. Minimum maximum temperature gradient coil design.

    PubMed

    While, Peter T; Poole, Michael S; Forbes, Larry K; Crozier, Stuart

    2013-08-01

    Ohmic heating is a serious problem in gradient coil operation. A method is presented for redesigning cylindrical gradient coils to operate at minimum peak temperature, while maintaining field homogeneity and coil performance. To generate these minimaxT coil windings, an existing analytic method for simulating the spatial temperature distribution of single layer gradient coils is combined with a minimax optimization routine based on sequential quadratic programming. Simulations are provided for symmetric and asymmetric gradient coils that show considerable improvements in reducing maximum temperature over existing methods. The winding patterns of the minimaxT coils were found to be heavily dependent on the assumed thermal material properties and generally display an interesting "fish-eye" spreading of windings in the dense regions of the coil. Small prototype coils were constructed and tested for experimental validation and these demonstrate that with a reasonable estimate of material properties, thermal performance can be improved considerably with negligible change to the field error or standard figures of merit. © 2012 Wiley Periodicals, Inc.

  3. A Novel Design of an Automatic Lighting Control System for a Wireless Sensor Network with Increased Sensor Lifetime and Reduced Sensor Numbers

    PubMed Central

    Mohamaddoust, Reza; Haghighat, Abolfazl Toroghi; Sharif, Mohamad Javad Motahari; Capanni, Niccolo

    2011-01-01

    Wireless sensor networks (WSN) are currently being applied to energy conservation applications such as light control. We propose a design for such a system called a Lighting Automatic Control System (LACS). The LACS system contains a centralized or distributed architecture determined by application requirements and space usage. The system optimizes the calculations and communications for lighting intensity, incorporates user illumination requirements according to their activities and performs adjustments based on external lighting effects in external sensor and external sensor-less architectures. Methods are proposed for reducing the number of sensors required and increasing the lifetime of those used, for considerably reduced energy consumption. Additionally we suggest methods for improving uniformity of illuminance distribution on a workplane’s surface, which improves user satisfaction. Finally simulation results are presented to verify the effectiveness of our design. PMID:22164114

  4. A novel design of an automatic lighting control system for a wireless sensor network with increased sensor lifetime and reduced sensor numbers.

    PubMed

    Mohamaddoust, Reza; Haghighat, Abolfazl Toroghi; Sharif, Mohamad Javad Motahari; Capanni, Niccolo

    2011-01-01

    Wireless sensor networks (WSN) are currently being applied to energy conservation applications such as light control. We propose a design for such a system called a lighting automatic control system (LACS). The LACS system contains a centralized or distributed architecture determined by application requirements and space usage. The system optimizes the calculations and communications for lighting intensity, incorporates user illumination requirements according to their activities and performs adjustments based on external lighting effects in external sensor and external sensor-less architectures. Methods are proposed for reducing the number of sensors required and increasing the lifetime of those used, for considerably reduced energy consumption. Additionally we suggest methods for improving uniformity of illuminance distribution on a workplane's surface, which improves user satisfaction. Finally simulation results are presented to verify the effectiveness of our design.

  5. The Method of Fundamental Solutions using the Vector Magnetic Dipoles for Calculation of the Magnetic Fields in the Diagnostic Problems Based on Full-Scale Modelling Experiment

    NASA Astrophysics Data System (ADS)

    Bakhvalov, Yu A.; Grechikhin, V. V.; Yufanova, A. L.

    2016-04-01

    The article describes the calculation of magnetic fields in diagnostic problems for technical systems based on full-scale modeling experiments. Using the gridless method of fundamental solutions and its variants, in combination with grid methods (finite differences and finite elements), considerably reduces the dimensionality of the field-calculation task and hence the calculation time. The method is implemented using fictitious magnetic charges. Much attention is also given to calculation accuracy: errors arise when the distance between the charges is chosen poorly. The authors propose using vector magnetic dipoles to improve the accuracy of the magnetic field calculation, and examples of this approach are given. The article presents the results of this research, which support recommending the approach within the method of fundamental solutions for full-scale modeling tests of technical systems.

  6. Transfer Alignment Error Compensator Design Based on Robust State Estimation

    NASA Astrophysics Data System (ADS)

    Lyou, Joon; Lim, You-Chol

    This paper examines the transfer alignment problem of the StrapDown Inertial Navigation System (SDINS), which is subject to the ship's roll and pitch. Major error sources for velocity and attitude matching are the lever arm effect, measurement time delay, and ship-body flexure. To reduce these alignment errors, an error compensation method based on state augmentation and robust state estimation is devised. A linearized error model for the velocity and attitude matching transfer alignment system is derived first by linearizing the nonlinear measurement equation with respect to its time delay and the dominant Y-axis flexure, and by augmenting the delay state and flexure state into conventional linear state equations. Then an H∞ filter is introduced to account for modeling uncertainties of the time delay and the ship-body flexure. The simulation results show that this method considerably decreases azimuth alignment errors.

  7. Putting Teeth into Open Architectures: Infrastructure for Reducing the Need for Retesting

    DTIC Science & Technology

    2007-04-30

    the test and evaluation team. This paper outlines new approaches to quality assurance and testing that are better suited for providing...reconfiguration. Testing of reusable subsystems is also subject to the above considerations and, similarly, requires new methods for effectively achieving...architectural model. Thus, fully realizing the open architecture vision requires a new paradigm for test and evaluation. We propose such a

  8. Humidity Steady State Low Voltage Testing of MLCCs (Based on NESC Technical Assessment Report)

    NASA Technical Reports Server (NTRS)

    Sampson, Mike; Brusse, Jay; Teverovsky, Alexander

    2011-01-01

    Review of the low voltage reduced Insulation Resistance (IR) failure phenomenon in multilayer ceramic capacitors (MLCCs) and NASA approaches to contend with this risk. 1. Analyze published materials on root cause mechanisms. 2. Investigate suitability of current test methods to assess MLCC lots for susceptibility. 3. Review current NASA parts selection and application guidelines in consideration of benefits vs. disadvantages.

  9. Ecological Momentary Assessment is a Neglected Methodology in Suicidology.

    PubMed

    Davidson, Collin L; Anestis, Michael D; Gutierrez, Peter M

    2017-01-02

    Ecological momentary assessment (EMA) is a group of research methods that collect data frequently, in many contexts, and in real-world settings. EMA has been fairly neglected in suicidology. The current article provides an overview of EMA for suicidologists including definitions, data collection considerations, and different sampling strategies. Next, the benefits of EMA in suicidology (i.e., reduced recall bias, accurate tracking of fluctuating variables, testing assumptions of theories, use in interventions), participant safety considerations, and examples of published research that investigate self-directed violence variables using EMA are discussed. The article concludes with a summary and suggested directions for EMA research in suicidology with the particular aim to spur the increased use of this methodology among suicidologists.

  10. Radiation Shielding for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Caffrey, Jarvis A.

    2016-01-01

    Design and analysis of radiation shielding for nuclear thermal propulsion has continued at Marshall Space Flight Center. A set of optimization tools are in development, and strategies for shielding optimization will be discussed. Considerations for the concurrent design of internal and external shielding are likely required for a mass optimal shield design. The task of reducing radiation dose to crew from a nuclear engine is considered to be less challenging than the task of thermal mitigation for cryogenic propellant, especially considering the likely implementation of additional crew shielding for protection from solar particles and cosmic rays. Further consideration is thus made for the thermal effects of radiation absorption in cryogenic propellant. Materials challenges and possible methods of manufacturing are also discussed.

  11. A High Power Density Single-Phase PWM Rectifier with Active Ripple Energy Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ning, Puqi; Wang, Ruxi; Wang, Fei

    It is well known that second-order harmonic current and a corresponding ripple voltage exist on the dc bus of single-phase PWM rectifiers. The low frequency harmonic current is normally filtered using a bulk capacitor on the bus, which results in low power density. This paper proposes an active ripple energy storage method that can effectively reduce the energy storage capacitance. The feed-forward control method and design considerations are provided. Simulation and 15 kW experimental results are provided for verification purposes.

  12. Oxidation-resisting technology of W-Re thermocouples and their industrial applications

    NASA Astrophysics Data System (ADS)

    Wang, K.; Dai, M.; Dong, J.; Wang, L.; Wang, T.

    2013-09-01

    We use DSC/TG, SEM and EPMA approaches to investigate the high temperature oxidation behavior of Type C W-Re thermocouple wires and the W-Re powders from which the wires were made. To address the oxidation of W-Re thermocouples, a chemical method was developed, as opposed to the commonly used physical (vacuum-pumping) method. Several solid-packed techniques, such as stuffing with inert material, chemical deoxidizing, gas absorption and sealing, were employed to prevent the W-Re thermocouples from oxidizing. Based on comprehensive consideration of various parameters in process industries, a series of industrial W-Re thermocouples has been successfully used in oxidizing and reducing atmospheres, high temperature alkali and other harsh environments. The service life is 6 to 12 months in the strongly oxidizing atmosphere of a Cr2O3-Al2O3 brick kiln and 2 to 3 months in high temperature alkali and in the reducing atmosphere of CO.

  13. Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential

    DOE PAGES

    Cologne, John; Grant, Eric J.; Nakashima, Eiji; ...

    2012-01-01

    Objective. Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security, but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.
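
    As a simple illustration of rounding away digits of relative accuracy (a sketch of the general idea, not the study's exact masking rule), the helper below rounds values to a chosen number of significant digits; the data and the choice of two significant digits are assumptions.

    ```python
    # Masking numeric values by rounding to a fixed number of significant digits.
    import numpy as np

    def round_sig(x, sig=2):
        """Round each value to `sig` significant digits."""
        x = np.asarray(x, dtype=float)
        absx = np.where(x == 0, 1.0, np.abs(x))      # avoid log10(0)
        mags = 10.0 ** np.floor(np.log10(absx))      # order of magnitude of each value
        factor = 10.0 ** (sig - 1)
        return np.round(x / mags * factor) / factor * mags

    doses = np.array([0.01234, 0.4567, 12.89, 305.2])   # hypothetical exposure estimates
    print(round_sig(doses, sig=2))   # -> [0.012 0.46 13.0 310.0]
    ```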

  14. Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential

    PubMed Central

    Cologne, John; Grant, Eric J.; Nakashima, Eiji; Chen, Yun; Funamoto, Sachiyo; Katayama, Hiroaki

    2012-01-01

    Objective. Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs. PMID:22505949

  15. Channel Model Optimization with Reflection Residual Component for Indoor MIMO-VLC System

    NASA Astrophysics Data System (ADS)

    Chen, Yong; Li, Tengfei; Liu, Huanlin; Li, Yichao

    2017-12-01

    A fast channel modeling method is studied in this paper to solve the problem of reflection channel gain for multiple-input multiple-output visible light communication (MIMO-VLC) systems. To reduce the computational complexity associated with the number of reflections, no more than three reflections are taken into consideration in VLC. We treat a higher-order reflection link as a composition of multiple line-of-sight links and introduce a reflection residual component to characterize higher-order reflections (more than two reflections). Computer simulations are performed for the point-to-point channel impulse response, received optical power and received signal-to-noise ratio. Based on theoretical analysis and simulation results, the proposed method can effectively reduce the computational complexity of higher-order reflections in channel modeling.

  16. An experimental study of wall adaptation and interference assessment using Cauchy integral formula

    NASA Technical Reports Server (NTRS)

    Murthy, A. V.

    1991-01-01

    This paper summarizes the results of an experimental study of combined wall adaptation and residual interference assessment using the Cauchy integral formula. The experiments were conducted on a supercritical airfoil model in the Langley 0.3-m Transonic Cryogenic Tunnel solid flexible wall test section. The ratio of model chord to test section height was about 0.7. The method worked satisfactorily in reducing the blockage interference and demonstrated the primary requirement of correcting for blockage effects at high model incidences to correctly determine high-lift characteristics. The studies show that the method has potential for reducing the residual interference to considerably low levels. However, corrections to blockage and upwash velocity gradients may still be required for the final adapted wall shapes.

  17. Aircraft engine pollution reduction.

    NASA Technical Reports Server (NTRS)

    Rudey, R. A.

    1972-01-01

    The effect of engine operation on the types and levels of the major aircraft engine pollutants is described and the major factors governing the formation of these pollutants during the burning of hydrocarbon fuel are discussed. Methods which are being explored to reduce these pollutants are discussed and their application to several experimental research programs are pointed out. Results showing significant reductions in the levels of carbon monoxide, unburned hydrocarbons, and oxides of nitrogen obtained from experimental combustion research programs are presented and discussed to point out potential application to aircraft engines. An experimental program designed to develop and demonstrate these and other advanced, low pollution combustor design methods is described. Results that have been obtained to date indicate considerable promise for reducing advanced engine exhaust pollutants to levels significantly below current engines.

  18. Application of Reduced Order Transonic Aerodynamic Influence Coefficient Matrix for Design Optimization

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley W.

    2009-01-01

    Supporting the Aeronautics Research Mission Directorate guidelines, the National Aeronautics and Space Administration [NASA] Dryden Flight Research Center is developing a multidisciplinary design, analysis, and optimization [MDAO] tool. This tool will leverage existing tools and practices and allow the easy integration and adoption of new state-of-the-art software. Today's modern aircraft designs in the transonic regime are a challenging task due to the computation time required for unsteady aeroelastic analysis using a Computational Fluid Dynamics [CFD] code. Design approaches in this speed regime are mainly based on manual trial and error. Because of the time required for unsteady time-domain CFD computations, this considerably slows down the whole design process. These analyses are usually performed repeatedly to optimize the final design. As a result, there is considerable motivation to be able to perform aeroelastic calculations more quickly and inexpensively. This paper describes the development of an unsteady transonic aeroelastic design methodology for design optimization using a reduced modeling method and unsteady aerodynamic approximation. The method requires the unsteady transonic aerodynamics to be represented in the frequency or Laplace domain. A dynamically linear assumption is used for creating Aerodynamic Influence Coefficient [AIC] matrices in the transonic speed regime. Unsteady CFD computations are needed only for the important columns of an AIC matrix, which correspond to the primary modes for flutter. Order reduction techniques, such as Guyan reduction and the improved reduction system, are used to reduce the size of the problem; transonic flutter can then be found by classic methods such as rational function approximation, p-k, p, root-locus, etc. Such a methodology could be incorporated into the MDAO tool for design optimization at a reasonable computational cost. The proposed technique is verified using the Aerostructures Test Wing 2, which was designed, built, and tested at NASA Dryden Flight Research Center. The results from the full order model and the approximate reduced order model are analyzed and compared.

  19. Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity.

    PubMed

    Napoletano, Paolo; Piccoli, Flavio; Schettini, Raimondo

    2018-01-12

    Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art.
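
    The following toy sketch conveys the dictionary-based scoring idea: each subregion is scored by its distance to the nearest anomaly-free subregion in a training dictionary. A real system would compare CNN feature vectors; here flattened synthetic patches stand in for those features, so everything below is an illustrative assumption rather than the paper's implementation.

    ```python
    # Nearest-neighbor "self-similarity" anomaly scoring against an anomaly-free dictionary.
    import numpy as np

    def anomaly_scores(patches, dictionary):
        """patches, dictionary: arrays of shape (n, d) of feature vectors."""
        # Distance to the nearest anomaly-free feature = degree of abnormality
        d2 = ((patches[:, None, :] - dictionary[None, :, :]) ** 2).sum(axis=-1)
        return np.sqrt(d2.min(axis=1))

    rng = np.random.default_rng(5)
    dictionary = rng.normal(0.0, 1.0, size=(200, 64))   # anomaly-free training "features"
    normal = rng.normal(0.0, 1.0, size=(5, 64))
    defect = rng.normal(4.0, 1.0, size=(2, 64))         # shifted distribution = anomaly
    scores = anomaly_scores(np.vstack([normal, defect]), dictionary)
    print(np.round(scores, 1))                          # last two scores are clearly larger
    ```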

  20. Food Safety Impacts from Post-Harvest Processing Procedures of Molluscan Shellfish.

    PubMed

    Baker, George L

    2016-04-18

    Post-harvest Processing (PHP) methods are viable food processing methods employed to reduce human pathogens in molluscan shellfish that would normally be consumed raw, such as raw oysters on the half-shell. The efficacy of human pathogen reduction associated with PHP varies with respect to time, temperature, salinity, pressure, and process exposure. Regulatory requirements and PHP molluscan shellfish quality implications are major considerations for PHP usage. Food safety impacts associated with PHP of molluscan shellfish vary in their efficacy and may have synergistic outcomes when combined. Further research on many PHP methods is necessary, and emerging PHP methods that result in minimal quality loss and effective human pathogen reduction should be explored.

  1. Neutron skyshine calculations for the PDX tokamak

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wheeler, F.J.; Nigg, D.W.

    1979-01-01

    The Poloidal Divertor Experiment (PDX) at Princeton will be the first operating tokamak to require a substantial radiation shield. The PDX shielding includes a water-filled roof shield over the machine to reduce air scattering skyshine dose in the PDX control room and at the site boundary. During the design of this roof shield a unique method was developed to compute the neutron source emerging from the top of the roof shield for use in Monte Carlo skyshine calculations. The method is based on simple, one-dimensional calculations rather than multidimensional calculations, resulting in considerable savings in computer time and input preparation effort. This method is described.

  2. Two-dimensional subsonic compressible flow past elliptic cylinders

    NASA Technical Reports Server (NTRS)

    Kaplan, Carl

    1938-01-01

    The method of Poggi is used to calculate, for perfect fluids, the effect of compressibility upon the flow on the surface of an elliptic cylinder at zero angle of attack and with no circulation. The result is expressed in a closed form and represents a rigorous determination of the velocity of the fluid at the surface of the obstacle insofar as the second approximation is concerned. Comparison is made with Hooker's treatment of the same problem according to the method of Janzen and Rayleigh, and it is found that, for thick elliptic cylinders, the two methods agree very well. The labor of computation is considerably reduced by the present solution.

  3. A study of extraction process and in vitro antioxidant activity of total phenols from Rhizoma Imperatae.

    PubMed

    Zhou, Xian-rong; Wang, Jian-hua; Jiang, Bo; Shang, Jin; Zhao, Chang-qiong

    2013-01-01

    The study investigated the extraction method of Rhizoma Imperatae and its antioxidant activity, and provided a basis for its rational development. The extraction method of Rhizoma Imperatae was determined using an orthogonal design test and by total phenol content; its hydroxyl radical scavenging ability was measured by the Fenton reaction, and the potassium ferricyanide reduction method was used to determine its reducing power. The results showed that the optimum extraction process for Rhizoma Imperatae was a 50-fold volume of water at 30 °C, with three extractions of 2 h each. Its IC50 for scavenging of hydroxyl radicals was 0.0948 mg/mL, while the IC50 of ascorbic acid was 0.1096 mg/mL; in the potassium ferricyanide reduction assay, the extract exhibited reducing power comparable to that of ascorbic acid. The study concluded that Rhizoma Imperatae extract contains a relatively large amount of polyphenols and has good anti-oxidation ability.

  4. Intravenous Fluid Mixing in Normal Gravity, Partial Gravity, and Microgravity: Down-Selection of Mixing Methods

    NASA Technical Reports Server (NTRS)

    Niederhaus, Charles E.; Miller, Fletcher J.

    2008-01-01

    The missions envisioned under the Vision for Space Exploration will require development of new methods to handle crew medical care. Medications and intravenous (IV) fluids have been identified as one area needing development. Storing certain medications and solutions as powders or concentrates can both increase the shelf life and reduce the overall mass and volume of medical supplies. The powders or concentrates would then be mixed in an IV bag with Sterile Water for Injection produced in situ from the potable water supply. Fluid handling in microgravity is different than terrestrial settings, and requires special consideration in the design of equipment. This document describes the analyses and down-select activities used to identify the IV mixing method to be developed that is suitable for ISS and exploration missions. The chosen method is compatible with both normal gravity and microgravity, maintains sterility of the solution, and has low mass and power requirements. The method will undergo further development, including reduced gravity aircraft experiments and computations, in order to fully develop the mixing method and associated operational parameters.

  5. Fault Tolerant Considerations and Methods for Guidance and Control Systems

    DTIC Science & Technology

    1987-07-01

    multifunction devices such as microprocessors with software. In striving toward the economic goal, however, a cost is incurred in a different coin, i.e...therefore been developed which reduces the software risk to acceptable proportions. Several of the techniques thus developed incur no significant cost...complex that their design and implementation need computerized tools in order to be cost-effective (in a broad sense, including the capability of

  6. LSSA (Low-cost Silicon Solar Array) project

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Methods are explored for economically generating electrical power to meet future requirements. The Low-Cost Silicon Solar Array Project (LSSA) was established to reduce the price of solar arrays by improving manufacturing technology, adapting mass production techniques, and promoting user acceptance. The new manufacturing technology includes the consideration of new silicon refinement processes, silicon sheet growth techniques, encapsulants, and automated assembly production being developed under contract by industries and universities.

  7. Determination of pseudogap state density and carrier mobility in rf sputtered amorphous silicon. Quarterly technical progress report, January-March 31, 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul, W

    1980-06-01

    The effect of a variety of plasma cleaning procedures on the level of bulk and interfacial contaminants in the films is analyzed by secondary ion mass spectrometry. Bulk levels of O have been reduced considerably by N2 plasma cleaning, but no reproducible reductions in interfacial contamination have been achieved. A method is described for determining the gap state density N(epsilon) of a-Si:H from the field effect, in which no assumptions are made about the form of the band bending in the semiconductor. The problem is reduced to three successive integrals over an assumed N(epsilon) by a change of variable from distance to applied voltage, and the best fit to the experimental data is obtained by iteration of the assumed state density. The method is shown to be no less rigorous and considerably more economical than the recent analysis of Goodman, Fritzsche and Ozaki. In addition, an experimental means of determining the flat-band voltage to within 5% of the maximum gate voltage V_g used is demonstrated, by finding the value of V_g for which (kT/e) d(log I_SD)/dV_g is independent of temperature.

  8. Key considerations in designing a patient navigation program for colorectal cancer screening.

    PubMed

    DeGroff, Amy; Coa, Kisha; Morrissey, Kerry Grace; Rohan, Elizabeth; Slotman, Beth

    2014-07-01

    Colorectal cancer is the second leading cause of cancer mortality among those cancers affecting both men and women. Screening is known to reduce mortality by detecting cancer early and through colonoscopy, removing precancerous polyps. Only 58.6% of adults are currently up-to-date with colorectal cancer screening by any method. Patient navigation shows promise in increasing adherence to colorectal cancer screening and reducing health disparities; however, it is a complex intervention that is operationalized differently across institutions. This article describes 10 key considerations in designing a patient navigation intervention for colorectal cancer screening based on a literature review and environmental scan. Factors include (1) identifying a theoretical framework and setting program goals, (2) specifying community characteristics, (3) establishing the point(s) of intervention within the cancer continuum, (4) determining the setting in which navigation services are provided, (5) identifying the range of services offered and patient navigator responsibilities, (6) determining the background and qualifications of navigators, (7) selecting the method of communications between patients and navigators, (8) designing the navigator training, (9) defining oversight and supervision for the navigators, and (10) evaluating patient navigation. Public health practitioners can benefit from the practical perspective offered here for designing patient navigation programs. © 2013 Society for Public Health Education.

  9. A Statistical Method for Reducing Sidelobe Clutter for the Ku-Band Precipitation Radar on Board the GPM Core Observatory

    NASA Technical Reports Server (NTRS)

    Kubota, Takuji; Iguchi, Toshio; Kojima, Masahiro; Liao, Liang; Masaki, Takeshi; Hanado, Hiroshi; Meneghini, Robert; Oki, Riko

    2016-01-01

    A statistical method to reduce the sidelobe clutter of the Ku-band precipitation radar (KuPR) of the Dual-Frequency Precipitation Radar (DPR) on board the Global Precipitation Measurement (GPM) Core Observatory is described and evaluated using DPR observations. The KuPR sidelobe clutter was much more severe than that of the Precipitation Radar on board the Tropical Rainfall Measuring Mission (TRMM), and it has caused the misidentification of precipitation. The statistical method to reduce sidelobe clutter was constructed by subtracting the estimated sidelobe power, based upon a multiple regression model with explanatory variables of the normalized radar cross section (NRCS) of the surface, from the received power of the echo. The saturation of the NRCS at near-nadir angles, resulting from strong surface scattering, was considered in the calculation of the regression coefficients. The method was implemented in the KuPR algorithm and applied to KuPR-observed data. It was found that the received power from sidelobe clutter over the ocean was largely reduced by using the developed method, although some of the received power from the sidelobe clutter still remained. From the statistical results of the evaluations, it was shown that the number of KuPR precipitation events in the clutter region, after the method was applied, was comparable to that in the clutter-free region. This confirms the reasonable performance of the method in removing sidelobe clutter. For further improving the effectiveness of the method, it is necessary to improve the consideration of the NRCS saturation, which will be explored in future work.
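
    A toy numerical sketch of the regression-and-subtract idea follows: ordinary least squares relates a sidelobe power proxy to NRCS predictors, and the fitted contribution is subtracted from the received power. The synthetic variables, coefficients and noise levels are assumptions for illustration and do not reproduce the KuPR algorithm (for instance, the NRCS saturation handling is omitted).

    ```python
    # Estimate sidelobe power from NRCS predictors by multiple regression, then subtract it.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 1000
    nrcs = rng.normal(0.0, 3.0, size=(n, 3))             # synthetic NRCS predictors (dB)
    true_coef = np.array([0.8, 0.3, 0.1])                 # assumed "true" sidelobe response
    sidelobe = nrcs @ true_coef + rng.normal(0, 0.2, n)   # sidelobe power proxy

    # Least-squares estimate of regression coefficients (with intercept)
    X = np.column_stack([np.ones(n), nrcs])
    coef, *_ = np.linalg.lstsq(X, sidelobe, rcond=None)

    # Subtract the estimated sidelobe contribution from the received echo power
    received = sidelobe + rng.normal(5.0, 1.0, n)         # echo = precipitation + sidelobe (toy)
    corrected = received - X @ coef
    print("estimated coefficients:", np.round(coef, 3))
    ```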

  10. Electronic structure modifications and band gap narrowing in Zn0.95V0.05O

    NASA Astrophysics Data System (ADS)

    Ahad, Abdul; Majid, S. S.; Rahman, F.; Shukla, D. K.; Phase, D. M.

    2018-04-01

    We present here structural, optical and electronic structure studies of Zn0.95V0.05O synthesized using the solid state method. Rietveld refinement of the x-ray diffraction pattern indicates no considerable change in the lattice of doped ZnO. The band gap of the doped sample, as calculated from Kubelka-Munk transformed reflectance spectra, is found to be reduced compared to that of pure ZnO. Considerable changes in absorbance in the UV-Vis range are observed in the doped sample. The V-doping-induced decrease in band gap is supported by x-ray absorption spectroscopy measurements. It is experimentally confirmed that the conduction band edge in Zn0.95V0.05O has shifted closer to the Fermi level than in pure ZnO.

  11. Electrodeposition of reduced graphene oxide with chitosan based on the coordination deposition method

    PubMed Central

    Liu, Mingyang; Qin, Chaoran; Zhang, Zheng; Ma, Shuai; Cai, Xiuru; Li, Xueqian

    2018-01-01

    The electrodeposition of graphene has drawn considerable attention due to its appealing applications for sensors, supercapacitors and lithium-ion batteries. However, there are still some limitations in the current electrodeposition methods for graphene. Here, we present a novel electrodeposition method for the direct deposition of reduced graphene oxide (rGO) with chitosan. In this method, a 2-hydroxypropyltrimethylammonium chloride-based chitosan-modified rGO material was prepared. This material disperses homogenously in the chitosan solution, forming a deposition solution with good dispersion stability. Subsequently, the modified rGO material was deposited on an electrode through codeposition with chitosan, based on the coordination deposition method. After electrodeposition, the homogeneous, deposited rGO/chitosan films can be generated on copper or silver electrodes or substrates. The electrodeposition method allows for the convenient and controlled creation of rGO/chitosan nanocomposite coatings and films of different shapes and thickness. It also introduces a new method of creating films, as they can be peeled completely from the electrodes. Moreover, this method allows for a rGO/chitosan film to be deposited directly onto an electrode, which can then be used for electrochemical detection. PMID:29765797

  12. A Cost Effective Block Framing Scheme for Underwater Communication

    PubMed Central

    Shin, Soo-Young; Park, Soo-Hyun

    2011-01-01

    In this paper, the Selective Multiple Acknowledgement (SMA) method, based on Multiple Acknowledgement (MA), is proposed to efficiently reduce the amount of data transmission by redesigning the transmission frame structure and taking into consideration underwater transmission characteristics. The method is suited to integrated underwater system models, as the proposed method can handle the same amount of data in a much more compact frame structure without any appreciable loss of reliability. Herein, the performance of the proposed SMA method was analyzed and compared to those of the conventional Automatic Repeat-reQuest (ARQ), Block Acknowledgement (BA), block response, and MA methods. The efficiency of the underwater sensor network, which forms a large cluster and mostly contains uplink data, is expected to be improved by the proposed method. PMID:22247689

  13. [Consideration of the deuterium-free water supply to an expedition to Mars].

    PubMed

    Siniak, Iu E; Turusov, V S; Grigor'ev, A I; Zaridze, D G; Gaĭdadymov, V B; Gus'kova, E I; Antoshina, E E; Gor'kova, T G; Trukhanova, L S

    2003-01-01

    Interplanetary missions, including to Mars, will put crews into severe radiation conditions. Search for methods of reducing the risk of radiation-induced cancer is of the top priority in preparation for the mission to Mars. One of the options is designing life support systems that will generate water with low content of the stable hydrogen isotope (deuterium) to be consumed by crewmembers. Preliminary investigations have shown that a decrease of the deuterium fraction by 65% does impart to water certain anti-cancer properties. Therefore, drinking deuterium-free water has the potential to reduce the risk of cancer consequent to the extreme radiation exposure of the Martian crew.

  14. Variational method of determining effective moduli of polycrystals: (A) hexagonal symmetry, (B) trigonal symmetry

    USGS Publications Warehouse

    Peselnick, L.; Meister, R.

    1965-01-01

    Variational principles of anisotropic elasticity have been applied to aggregates of randomly oriented pure-phase polycrystals having hexagonal symmetry and trigonal symmetry. The bounds of the effective elastic moduli obtained in this way show a considerable improvement over the bounds obtained by means of the Voigt and Reuss assumptions. The Hill average is found to be in most cases a good approximation when compared to the bounds found from the variational method. The new bounds reduce in their limits to the Voigt and Reuss values. ?? 1965 The American Institute of Physics.

  15. An approach for estimating the magnetization direction of magnetic anomalies

    NASA Astrophysics Data System (ADS)

    Li, Jinpeng; Zhang, Yingtang; Yin, Gang; Fan, Hongbo; Li, Zhining

    2017-02-01

    An approach for estimating the magnetization direction of magnetic anomalies in the presence of remanent magnetization through correlation between normalized source strength (NSS) and reduced-to-the-pole (RTP) is proposed. The observation region was divided into several calculation areas and the RTP field was transformed using different assumed values of the magnetization directions. Following this, the cross-correlation between NSS and RTP field was calculated, and it was found that the correct magnetization direction was that corresponding to the maximum cross-correlation value. The approach was tested on both simulated and real magnetic data. The results showed that the approach was effective in a variety of situations and considerably reduced the effect of remanent magnetization. Thus, the method using NSS and RTP is more effective compared to other methods such as using the total magnitude anomaly and RTP.
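
    To make the selection criterion concrete, the sketch below picks, from a set of candidate magnetization directions, the one whose RTP field correlates best with the NSS. The RTP and NSS grids are synthetic stand-ins (computing the actual transforms is outside this sketch), and the candidate directions and grid size are arbitrary assumptions.

    ```python
    # Choose the magnetization direction whose RTP grid best correlates with the NSS grid.
    import numpy as np

    def best_direction(nss, rtp_candidates):
        """rtp_candidates: dict mapping (inclination, declination) -> RTP grid."""
        def corr(a, b):
            a, b = a.ravel(), b.ravel()
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return float(np.mean(a * b))   # normalized cross-correlation
        return max(rtp_candidates, key=lambda d: corr(nss, rtp_candidates[d]))

    # Synthetic illustration: the candidate at (60, 30) is constructed to match the NSS best
    rng = np.random.default_rng(3)
    nss_grid = rng.normal(size=(64, 64))
    candidates = {(i, d): rng.normal(size=(64, 64)) for i in (30, 60) for d in (0, 30)}
    candidates[(60, 30)] = nss_grid + 0.3 * rng.normal(size=(64, 64))
    print("estimated (inclination, declination):", best_direction(nss_grid, candidates))
    ```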

  16. An introduction to testing parachutes in wind tunnels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macha, J.

    1991-01-01

    This paper reviews some of the technical considerations and current practices for testing parachutes in conventional wind tunnels. Special challenges to the experimentalist caused by the fabric construction, flexible geometry, and buff shape of parachutes are discussed. In particular, the topics of measurement technique, similarity considerations, and wall interference are addressed in a summary manner. Many references are cited which provide detailed coverage of the state of the art in testing methods. From the discussions presented, it is obvious that there are some serious problems with state of the art methods, especially in the area of canopy instrumentation and when working with reduced-scale models. But if the experimentalist is informed about the relative importance of the various factors for a specific test objective, it is usually possible to design a test that will yield meaningful results. The lower cost and the more favorable measurement environment of wind tunnels make their use an attractive alternative to flight testing whenever possible. 26 refs., 5 figs., 1 tab.

  17. Study of activity based costing implementation for palm oil production using value-added and non-value-added activity consideration in PT XYZ palm oil mill

    NASA Astrophysics Data System (ADS)

    Sembiring, M. T.; Wahyuni, D.; Sinaga, T. S.; Silaban, A.

    2018-02-01

    Cost allocation in the manufacturing industry, particularly in palm oil mills, is still widely practiced based on estimation, which leads to cost distortion. In addition, the processing times determined by the company are not in accordance with the actual processing times at the work stations. Hence, the purpose of this study is to eliminate non-value-added activities so that processing time can be shortened and production cost reduced. The Activity Based Costing method is used in this research to calculate production cost, taking value-added and non-value-added activities into consideration. The results of this study show processing time reductions of 35.75% at the Weighing Bridge Station, 29.77% at the Sorting Station, 5.05% at the Loading Ramp Station, and 0.79% at the Sterilizer Station. The cost of goods manufactured for crude palm oil is IDR 5.236,81/kg calculated by the traditional method, IDR 4.583,37/kg calculated by the Activity Based Costing method before implementation of activity improvement, and IDR 4.581,71/kg after implementation of activity improvement. Meanwhile, the cost of goods manufactured for palm kernel is IDR 2.159,50/kg calculated by the traditional method, IDR 4.584,63/kg calculated by the Activity Based Costing method before implementation of activity improvement, and IDR 4.582,97/kg after implementation of activity improvement.

  18. Low rank alternating direction method of multipliers reconstruction for MR fingerprinting.

    PubMed

    Assländer, Jakob; Cloos, Martijn A; Knoll, Florian; Sodickson, Daniel K; Hennig, Jürgen; Lattanzi, Riccardo

    2018-01-01

    The proposed reconstruction framework addresses the reconstruction accuracy, noise propagation and computation time for magnetic resonance fingerprinting. Based on a singular value decomposition of the signal evolution, magnetic resonance fingerprinting is formulated as a low rank (LR) inverse problem in which one image is reconstructed for each singular value under consideration. This LR approximation of the signal evolution reduces the computational burden by reducing the number of Fourier transformations. Also, the LR approximation improves the conditioning of the problem, which is further improved by extending the LR inverse problem to an augmented Lagrangian that is solved by the alternating direction method of multipliers. The root mean square error and the noise propagation are analyzed in simulations. For verification, in vivo examples are provided. The proposed LR alternating direction method of multipliers approach shows a reduced root mean square error compared to the original fingerprinting reconstruction, to a LR approximation alone and to an alternating direction method of multipliers approach without a LR approximation. Incorporating sensitivity encoding allows for further artifact reduction. The proposed reconstruction provides robust convergence, reduced computational burden and improved image quality compared to other magnetic resonance fingerprinting reconstruction approaches evaluated in this study. Magn Reson Med 79:83-96, 2018. © 2017 International Society for Magnetic Resonance in Medicine. © 2017 International Society for Magnetic Resonance in Medicine.
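
    As a minimal illustration of the low-rank idea (a sketch only, assuming a toy dictionary of exponential decays rather than Bloch-simulated fingerprints), the snippet below builds a signal-evolution dictionary, truncates its SVD, and shows that a measured evolution is well represented by a handful of singular-vector coefficients, which is what allows one image to be reconstructed per retained singular value.

    ```python
    # Truncated-SVD compression of a fingerprinting-style signal dictionary.
    import numpy as np

    t = np.linspace(0.0, 1.0, 1000)[:, None]          # time points (s)
    T2 = np.linspace(0.02, 0.5, 2000)[None, :]        # relaxation times of dictionary atoms (s)
    D = np.exp(-t / T2)                               # dictionary: one signal evolution per column

    U, s, Vt = np.linalg.svd(D, full_matrices=False)
    K = 10                                            # rank retained (number of singular images)
    U_K = U[:, :K]                                    # temporal compression basis

    rng = np.random.default_rng(4)
    signal = D[:, 1000] + 0.01 * rng.standard_normal(1000)   # a noisy measured evolution (toy)
    coeffs = U_K.T @ signal                           # K coefficients instead of 1000 samples
    print("relative approximation error:",
          np.linalg.norm(signal - U_K @ coeffs) / np.linalg.norm(signal))
    ```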

  19. Electrical, Electronic and Electromechanical (EEE) Parts in the New Space Paradigm: When is Better the Enemy of Good Enough?

    NASA Technical Reports Server (NTRS)

    LaBel, Kenneth A.; Sampson, Michael J.

    2017-01-01

    As the space business rapidly evolves to accommodate a lower cost model of development and operation via concepts such as commercial space and small spacecraft (aka CubeSats and swarms), traditional EEE parts screening and qualification methods are being scrutinized under a risk-reward trade space. In this presentation, two basic concepts will be discussed: (1) the movement from complete risk aversion in EEE parts methods to managing and/or accepting risk via alternate approaches; and (2) emerging assurance methods to reduce overdesign as well as emerging model based mission assurance (MBMA) concepts. Example scenarios will be described, as well as considerations for trading traditional versus alternate methods.

  20. Anomaly Detection in Nanofibrous Materials by CNN-Based Self-Similarity

    PubMed Central

    Schettini, Raimondo

    2018-01-01

    Automatic detection and localization of anomalies in nanofibrous materials help to reduce the cost of the production process and the time of the post-production visual inspection process. Amongst all the monitoring methods, those exploiting Scanning Electron Microscope (SEM) imaging are the most effective. In this paper, we propose a region-based method for the detection and localization of anomalies in SEM images, based on Convolutional Neural Networks (CNNs) and self-similarity. The method evaluates the degree of abnormality of each subregion of an image under consideration by computing a CNN-based visual similarity with respect to a dictionary of anomaly-free subregions belonging to a training set. The proposed method outperforms the state of the art. PMID:29329268

  1. Analysis and modification of theory for impact of seaplanes on water

    NASA Technical Reports Server (NTRS)

    Mayo, Wilbur L

    1945-01-01

    An analysis of available theory on seaplane impact and a proposed modification thereto are presented. In previous methods the overall momentum of the float and virtual mass has been assumed to remain constant during the impact but the present analysis shows that this assumption is rigorously correct only when the resultant velocity of the float is normal to the keel. The proposed modification chiefly involves consideration of the fact that forward velocity of the seaplane float causes momentum to be passed into the hydrodynamic downwash (an action that is the entire consideration in the case of the planing float) and consideration of the fact that, for an impact with trim, the rate of penetration is determined not only by the velocity component normal to the keel but also by the velocity component parallel to the keel, which tends to reduce the penetration. Experimental data for planing, oblique impact, and vertical drop are used to show that the accuracy of the proposed theory is good.

  2. Comparison of Sensor Selection Mechanisms for an ERP-Based Brain-Computer Interface

    PubMed Central

    Metzen, Jan H.

    2013-01-01

    A major barrier for a broad applicability of brain-computer interfaces (BCIs) based on electroencephalography (EEG) is the large number of EEG sensor electrodes typically used. The necessity for this results from the fact that the relevant information for the BCI is often spread over the scalp in complex patterns that differ depending on subjects and application scenarios. Recently, a number of methods have been proposed to determine an individual optimal sensor selection. These methods have, however, rarely been compared against each other or against any type of baseline. In this paper, we review several selection approaches and propose one additional selection criterion based on the evaluation of the performance of a BCI system using a reduced set of sensors. We evaluate the methods in the context of a passive BCI system that is designed to detect a P300 event-related potential and compare the performance of the methods against randomly generated sensor constellations. For a realistic estimation of the reduced system's performance we transfer sensor constellations found on one experimental session to a different session for evaluation. We identified notable (and unanticipated) differences among the methods and could demonstrate that the best method in our setup is able to reduce the required number of sensors considerably. Though our application focuses on EEG data, all presented algorithms and evaluation schemes can be transferred to any binary classification task on sensor arrays. PMID:23844021
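
    The additional selection criterion mentioned above, evaluating a reduced sensor set by the performance of the downstream classifier, can be sketched on synthetic data. The example below is illustrative only and is not the authors' EEG pipeline: a handful of informative "sensors" are planted in otherwise uninformative data, a candidate subset is scored by cross-validated accuracy, and the same score is computed for randomly generated constellations of equal size as a baseline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_sensors = 400, 32
X = rng.standard_normal((n_trials, n_sensors))
y = rng.integers(0, 2, n_trials)
X[y == 1, :5] += 0.8          # only the first five "sensors" carry class information

def subset_score(sensors):
    """Performance-based selection criterion: mean cross-validated accuracy
    using only the given sensor indices."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, sensors], y, cv=5).mean()

k = 5
informative = list(range(5))
random_scores = [subset_score(rng.choice(n_sensors, size=k, replace=False))
                 for _ in range(20)]
print("informative subset:", round(subset_score(informative), 3))
print("random constellations (mean):", round(float(np.mean(random_scores)), 3))
```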

  3. Design considerations for a pressure-driven multi-stage rocket

    NASA Astrophysics Data System (ADS)

    Sauerwein, Steven Craig

    2002-01-01

    The purpose of this study was to examine the feasibility of using propellant tank pressurization to eliminate the use of high-pressure turbopumps in multi-stage liquid-fueled satellite launchers. Several new technologies were examined to reduce the mass of such a rocket. Composite materials have a greater strength-to-weight ratio than metals and can be used to reduce the weight of rocket propellant tanks and structure. Catalytically combined hydrogen and oxygen can be used to heat the pressurization gas, greatly reducing the amount of gas required. Ablatively cooled rocket engines can reduce the complexity and cost of the rocket. Methods were derived to estimate the mass of the various rocket components: a method to calculate the amount of gas needed to pressurize a propellant tank by modeling the behavior of the pressurization gas as the liquid propellant flows out of the tank, a way to estimate the mass and size of an ablatively cooled, composite-cased rocket engine, and a method to model the flight of such a rocket through the atmosphere in conjunction with optimization of the rocket's trajectory. The results show that while liquid propellant rockets using tank pressurization are larger than solid propellant rockets and turbopump-driven liquid propellant rockets, they are not impractically large.
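
    The trade highlighted above, that heating the pressurant greatly reduces the gas mass required, follows directly from the ideal-gas law. The sketch below is a back-of-the-envelope illustration with made-up numbers, not the thesis model: it estimates the helium mass needed to fill the emptied tank volume at constant tank pressure, for an unheated and a catalytically heated gas temperature.

```python
# Ideal-gas estimate of the pressurant mass needed to keep a propellant tank at
# constant pressure while the liquid drains (illustrative numbers only).
R_UNIVERSAL = 8.314  # J/(mol K)

def pressurant_mass(tank_pressure_pa, propellant_volume_m3,
                    gas_temperature_k, molar_mass_kg_per_mol):
    """Mass of gas required to fill the emptied volume at tank pressure,
    assuming ideal-gas behaviour and a uniform gas temperature."""
    moles = tank_pressure_pa * propellant_volume_m3 / (R_UNIVERSAL * gas_temperature_k)
    return moles * molar_mass_kg_per_mol

P_TANK = 2.0e6        # 2 MPa tank pressure (assumed)
V_PROP = 3.0          # m^3 of propellant expelled (assumed)
M_HELIUM = 4.0e-3     # kg/mol

cold = pressurant_mass(P_TANK, V_PROP, 300.0, M_HELIUM)   # unheated gas
hot = pressurant_mass(P_TANK, V_PROP, 900.0, M_HELIUM)    # gas heated (e.g. catalytically)
print(f"cold pressurant: {cold:.1f} kg, heated pressurant: {hot:.1f} kg")
```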

  4. Electron/Ion Transport Enhancer in High Capacity Li-Ion Battery Anodes

    DOE PAGES

    Kwon, Yo Han; Minnici, Krysten; Huie, Matthew M.; ...

    2016-08-30

    In this paper, magnetite (Fe3O4) was used as a model high capacity metal oxide active material to demonstrate advantages derived from consideration of both electron and ion transport in the design of composite battery electrodes. The conjugated polymer, poly[3-(potassium-4-butanoate) thiophene] (PPBT), was introduced as a binder component, while polyethylene glycol (PEG) was coated onto the surface of Fe3O4 nanoparticles. The introduction of PEG reduced aggregate size, enabled effective dispersion of the active materials and facilitated ionic conduction. As a binder for the composite electrode, PPBT underwent electrochemical doping, which enabled the formation of effective electrical bridges between the carbon and Fe3O4 components, allowing for more efficient electron transport. Additionally, the PPBT carboxylic moieties give rise to a porous structure and stable electrode performance. Finally, the methodical consideration of both enhanced electron and ion transport by introducing a carboxylated PPBT binder and PEG surface treatment leads to effectively reduced electrode resistance, which improved cycle life performance and rate capabilities.

  5. Nurse administrators' intentions and considerations in recruiting inactive nurses.

    PubMed

    Yu, Hsing-Yi; Tang, Fu-In; Chen, I-Ju; Yin, Teresa J C; Chen, Chu-Chieh; Yu, Shu

    2016-07-01

    To understand nurse administrators' intentions and considerations in recruiting inactive nurses and to examine predictors of intent to recruit. Few studies have provided insight into employer intentions and considerations in recruiting inactive nurses. A census survey collected data from 392 nurse administrators via a mailing method. Overall, 89.0% of nurse administrators were willing to recruit inactive nurses. Stepwise regression analysis revealed that the only predictor of nurse administrators' intention to recruit was nurse turnover rate at the hospital. Nurse administrators perceived the most important recruiting considerations were inactive nurses' cooperation with alternating shifts, health status and nursing licence. The most frequent reasons for not recruiting were an inactive nurse's lack of understanding of the medical environment and poor nursing competence. Most hospital nurse administrators were willing to recruit inactive nurses. Inactive nurses who wish to return to work should be qualified, willing to work both day and night shifts, and in good health. Nurse administrators can reduce the nursing shortage by recruiting inactive nurses. Re-entry preparation programmes should be implemented that will provide inactive nurses with knowledge of the current medical environment and the skills required to improve their nursing competence. © 2016 John Wiley & Sons Ltd.

  6. Using Chebyshev polynomials and approximate inverse triangular factorizations for preconditioning the conjugate gradient method

    NASA Astrophysics Data System (ADS)

    Kaporin, I. E.

    2012-02-01

    In order to precondition a sparse symmetric positive definite matrix, its approximate inverse is examined, which is represented as the product of two sparse mutually adjoint triangular matrices. In this way, the solution of the corresponding system of linear algebraic equations (SLAE) by applying the preconditioned conjugate gradient method (CGM) is reduced to performing only elementary vector operations and calculating sparse matrix-vector products. A method for constructing the above preconditioner is described and analyzed. The triangular factor has a fixed sparsity pattern and is optimal in the sense that the preconditioned matrix has a minimum K-condition number. The use of polynomial preconditioning based on Chebyshev polynomials makes it possible to considerably reduce the amount of scalar product operations (at the cost of an insignificant increase in the total number of arithmetic operations). The possibility of an efficient massively parallel implementation of the resulting method for solving SLAEs is discussed. For a sequential version of this method, the results obtained by solving 56 test problems from the Florida sparse matrix collection (which are large-scale and ill-conditioned) are presented. These results show that the method is highly reliable and has low computational costs.
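
    The polynomial-preconditioning idea can be sketched in a few lines. The example below is an illustrative stand-in rather than the paper's method: it omits the approximate inverse triangular factorization entirely and simply applies a fixed number of Chebyshev iteration steps (a fixed polynomial in the matrix, built from matrix-vector products only) as the preconditioner inside SciPy's conjugate gradient solver, with spectrum bounds that are known analytically for the 1-D Laplacian test matrix.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, LinearOperator

def chebyshev_apply(A, r, lmin, lmax, steps):
    """Approximate A^{-1} r with a few Chebyshev iteration steps (zero initial
    guess). The result is a fixed polynomial in A applied to r, so it can serve
    as a CG preconditioner that needs only matrix-vector products."""
    theta, delta = 0.5 * (lmax + lmin), 0.5 * (lmax - lmin)
    sigma = theta / delta
    rho = 1.0 / sigma
    d = r / theta                      # first correction direction
    x = np.zeros_like(r)
    res = r.copy()
    for _ in range(steps):
        x = x + d
        res = res - A @ d
        rho_new = 1.0 / (2.0 * sigma - rho)
        d = rho_new * rho * d + (2.0 * rho_new / delta) * res
        rho = rho_new
    return x

# SPD model problem: 1-D Laplacian, whose extreme eigenvalues are known analytically.
n = 500
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
lmin = 2.0 - 2.0 * np.cos(np.pi / (n + 1))
lmax = 2.0 - 2.0 * np.cos(n * np.pi / (n + 1))

M = LinearOperator((n, n), matvec=lambda r: chebyshev_apply(A, r, lmin, lmax, 8),
                   dtype=float)

def cg_iterations(preconditioner):
    count = [0]
    cg(A, b, M=preconditioner, callback=lambda xk: count.__setitem__(0, count[0] + 1))
    return count[0]

print("plain CG iterations:", cg_iterations(None))
print("Chebyshev-preconditioned CG iterations:", cg_iterations(M))
```

    Each preconditioned iteration spends a few extra matrix-vector products inside the polynomial but still needs only the usual pair of inner products, which is the trade-off the abstract alludes to: fewer scalar-product operations at the cost of a modest increase in total arithmetic.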

  7. Training and Education’s Impact on 2020 Officer Career Progression

    DTIC Science & Technology

    2006-01-01

    ... has become blurred. The two are often now combined in several important aspects. ... How the Marine Corps trains and educates people has changed in the last 20 years. Andragogy has grown considerably ... Andragogy is about understanding and improving how adults learn. Employing the methods developed by andragogy may reduce T&E time

  8. Hierarchical screening for multiple mental disorders.

    PubMed

    Batterham, Philip J; Calear, Alison L; Sunderland, Matthew; Carragher, Natacha; Christensen, Helen; Mackinnon, Andrew J

    2013-10-01

    There is a need for brief, accurate screening when assessing multiple mental disorders. Two-stage hierarchical screening, consisting of brief pre-screening followed by a battery of disorder-specific scales for those who meet diagnostic criteria, may increase the efficiency of screening without sacrificing precision. This study tested whether more efficient screening could be gained using two-stage hierarchical screening than by administering multiple separate tests. Two Australian adult samples (N=1990) with high rates of psychopathology were recruited using Facebook advertising to examine four methods of hierarchical screening for four mental disorders: major depressive disorder, generalised anxiety disorder, panic disorder and social phobia. Using K6 scores to determine whether full screening was required did not increase screening efficiency. However, pre-screening based on two decision tree approaches or item gating led to considerable reductions in the mean number of items presented per disorder screened, with estimated item reductions of up to 54%. The sensitivity of these hierarchical methods approached 100% relative to the full screening battery. Further testing of the hierarchical screening approach based on clinical criteria and in other samples is warranted. The results demonstrate that a two-phase hierarchical approach to screening multiple mental disorders leads to considerable efficiency gains without reducing accuracy. Screening programs should take advantage of pre-screeners based on gating items or decision trees to reduce the burden on respondents. © 2013 Elsevier B.V. All rights reserved.
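
    The item-gating idea can be expressed in a few lines of code. The sketch below is hypothetical (made-up items, scales and thresholds rather than the instruments used in the study): a short gate is scored first for each disorder, and the remaining items of that disorder's scale are only administered when the gate score crosses a threshold, which is where the reduction in items presented comes from.

```python
def administer(battery, responses, gate_items=2, gate_threshold=3):
    """Two-stage (gated) screening: ask a few gate items per disorder and
    administer the full scale only when the gate score crosses a threshold.
    Returns per-disorder scores (None if skipped) and total items asked."""
    results, items_asked = {}, 0
    for disorder, item_ids in battery.items():
        gate = item_ids[:gate_items]
        gate_score = sum(responses[i] for i in gate)
        items_asked += len(gate)
        if gate_score < gate_threshold:
            results[disorder] = None          # screened out by the pre-screener
            continue
        rest = item_ids[gate_items:]
        items_asked += len(rest)
        results[disorder] = gate_score + sum(responses[i] for i in rest)
    return results, items_asked

# Hypothetical 9-item scales for two disorders and one respondent's answers (0-3).
battery = {"depression": [f"d{i}" for i in range(9)],
           "anxiety": [f"a{i}" for i in range(9)]}
responses = {**{f"d{i}": 3 for i in range(9)}, **{f"a{i}": 0 for i in range(9)}}
scores, n_items = administer(battery, responses)
print(scores, "items administered:", n_items)   # 11 items instead of 18
```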

  9. A Simple Method to Reduce both Lactic Acid and Ammonium Production in Industrial Animal Cell Culture

    PubMed Central

    Freund, Nathaniel W.; Croughan, Matthew S.

    2018-01-01

    Fed-batch animal cell culture is the most common method for commercial production of recombinant proteins. However, higher cell densities in these platforms are still limited due to factors such as excessive ammonium production, lactic acid production, nutrient limitation, and/or hyperosmotic stress related to nutrient feeds and base additions to control pH. To partly overcome these factors, we investigated a simple method to reduce both ammonium and lactic acid production—termed Lactate Supplementation and Adaptation (LSA) technology—through the use of CHO cells adapted to a lactate-supplemented medium. Using this simple method, we achieved a reduction of nearly 100% in lactic acid production with a simultaneous 50% reduction in ammonium production in batch shaker flask cultures. In subsequent fed-batch bioreactor cultures, lactic acid production and base addition were both reduced eight-fold. Viable cell densities of 35 million cells per mL and integral viable cell days of 273 million cell-days per mL were achieved, both among the highest currently reported for a fed-batch animal cell culture. Investigating the benefits of LSA technology in animal cell culture is worthy of further consideration and may lead to process conditions more favorable for advanced industrial applications. PMID:29382079

  10. A Simple Method to Reduce both Lactic Acid and Ammonium Production in Industrial Animal Cell Culture.

    PubMed

    Freund, Nathaniel W; Croughan, Matthew S

    2018-01-28

    Fed-batch animal cell culture is the most common method for commercial production of recombinant proteins. However, higher cell densities in these platforms are still limited due to factors such as excessive ammonium production, lactic acid production, nutrient limitation, and/or hyperosmotic stress related to nutrient feeds and base additions to control pH. To partly overcome these factors, we investigated a simple method to reduce both ammonium and lactic acid production-termed Lactate Supplementation and Adaptation (LSA) technology-through the use of CHO cells adapted to a lactate-supplemented medium. Using this simple method, we achieved a reduction of nearly 100% in lactic acid production with a simultaneous 50% reduction in ammonium production in batch shaker flask cultures. In subsequent fed-batch bioreactor cultures, lactic acid production and base addition were both reduced eight-fold. Viable cell densities of 35 million cells per mL and integral viable cell days of 273 million cell-days per mL were achieved, both among the highest currently reported for a fed-batch animal cell culture. Investigating the benefits of LSA technology in animal cell culture is worthy of further consideration and may lead to process conditions more favorable for advanced industrial applications.

  11. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). Here, the resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
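
    The variance-reduction mechanism, prescribing how many sample points land in each subgrid category and weighting each category by its area fraction, can be illustrated with a toy stratified estimator. The example below is purely illustrative (two made-up categories instead of eight, with invented rates): it compares the spread of a plain Monte Carlo estimate against one that deliberately over-samples the rare, high-variance "precipitating" category.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical grid box split into two subgrid categories with known area
# fractions; category 1 ("precipitating") is rare but has a large, variable rate.
fractions = np.array([0.9, 0.1])

def process_rate(cat, n):
    if cat == 0:
        return rng.normal(0.1, 0.05, n)    # quiescent portion of the grid box
    return rng.normal(50.0, 20.0, n)       # precipitating portion

true_mean = 0.9 * 0.1 + 0.1 * 50.0

def plain_monte_carlo(n):
    cats = rng.choice(2, size=n, p=fractions)
    return np.mean([process_rate(c, 1)[0] for c in cats])

def stratified(samples_per_category):
    # Prescribe how many sample points are drawn from each category and weight
    # each category's sample mean by its area fraction.
    return sum(frac * process_rate(cat, n).mean()
               for cat, (frac, n) in enumerate(zip(fractions, samples_per_category)))

plain = [plain_monte_carlo(40) for _ in range(500)]
strat = [stratified([10, 30]) for _ in range(500)]
print("true grid-box mean:", true_mean)
print("plain MC spread:", round(float(np.std(plain)), 2),
      "stratified spread:", round(float(np.std(strat)), 2))
```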

  12. Effect of School Based Treatment on the Prevalence of Schistosomiasis in Endemic Area in Yemen

    PubMed Central

    Abdulrab, A; Salem, A; Algobati, F; Saleh, S; Shibani, K; Albuthigi, R

    2013-01-01

    Background Schistosomiasis and soil transmitted infection is a major health problem of children from rural areas of developing countries including Yemen. In an attempt to reduce this burden, the Ministry of Public Health and Population in Yemen established in 2002 a programme for schistosomal and soil transmitted infection control that aimed to reduce morbidity and prevalence rates of schistosomiasis and soil transmitted helminths to less than 5% by 2015. The study was conducted to assess the current prevalence and intensity of schistosomal infection among schoolchildren in rural areas of the Taiz governorate after 6 years of running the National Control Programme. Methods Grade 3 schoolchildren from Shara'b Al-Raona district of Taiz Governorate were examined for infections with Schistosoma mansoni using the modified Kato–Katz method and S. haematobium using the filtration method in 1998/1999, and the prevalence and intensity of infection were compared with the baseline study conducted 6 years earlier. Results The S. mansoni prevalence in the study population was 31%, while the prevalence of S. haematobium was 18.6%. These figures are broadly similar to the prevalence found in the baseline study. The proportions of mild, moderate and severe infection for S. mansoni reached 15.9%, 60.6% and 23.5%, respectively. The severity of S. haematobium infection was 68.4%. Notably, the prevalence of S. haematobium was found to have increased. Conclusion The high prevalence of schistosomiasis and the low effectiveness of the control programme against schistosomal infection in the study area demand consideration of alternative treatment approaches. PMID:23914234

  13. Protecting privacy of shared epidemiologic data without compromising analysis potential.

    PubMed

    Cologne, John; Grant, Eric J; Nakashima, Eiji; Chen, Yun; Funamoto, Sachiyo; Katayama, Hiroaki

    2012-01-01

    Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.
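
    The masking technique recommended above, rounding away several digits of relative accuracy, differs from rounding to a fixed number of decimal places because it adapts to each variable's scale. A minimal sketch (with hypothetical values, not the study's data) is shown below.

```python
import math

def mask_by_rounding(value, significant_digits=3):
    """Round to a fixed number of significant digits, removing several digits
    of relative accuracy regardless of the variable's scale."""
    if value == 0:
        return 0.0
    magnitude = math.floor(math.log10(abs(value)))
    factor = 10 ** (significant_digits - 1 - magnitude)
    return round(value * factor) / factor

# Example: a small dose estimate and a larger covariate, masked to the same
# relative accuracy (values are invented).
print(mask_by_rounding(0.0123456, 3))   # -> 0.0123
print(mask_by_rounding(68.7421, 3))     # -> 68.7
```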

  14. Active material for fiber core made by powder-in-tube method: subsequent homogenization by means of stack-and-draw technique

    NASA Astrophysics Data System (ADS)

    Velmiskin, Vladimir V.; Egorova, Olga N.; Mishkin, Vladimir; Nishchev, Konstantin; Semjonov, Sergey L.

    2012-04-01

    A procedure for the preparation of optically homogeneous glass for fiber preforms through sintering of coarse oxide particles and further processing of the resultant glass, including several drawing and stacking steps, is described. Reducing the pressure to 10⁻² Torr during sintering considerably reduced the amount of gas bubbles in Yb/Al-doped silica glass and decreased the background loss to 100 dB/km after the third drawing-stacking-consolidation cycle. For comparison, a fiber singly doped with alumina was fabricated by the same procedure as above. The level of wavelength-independent losses in that fiber was 65 dB/km.

  15. Finite-time H∞ control for linear continuous system with norm-bounded disturbance

    NASA Astrophysics Data System (ADS)

    Meng, Qingyi; Shen, Yanjun

    2009-04-01

    In this paper, the definition of finite-time H∞ control is presented. The system under consideration is subject to a time-varying norm-bounded exogenous disturbance. The main aim of this paper is the design of a state feedback controller which ensures that the closed-loop system is finite-time bounded (FTB) and reduces the effect of the disturbance input on the controlled output to a prescribed level. A sufficient condition is presented for the solvability of this problem, which can be reduced to a feasibility problem involving linear matrix inequalities (LMIs). A detailed solving method is proposed for the restricted linear matrix inequalities. Finally, examples are given to show the validity of the methodology.

  16. Evaluation of a continuous regimen of levonorgestrel/ethinyl estradiol for contraception and control of menstrual symptoms.

    PubMed

    Jensen, Jeffrey T; Archer, David F

    2008-02-01

    Considerable recent interest has focused on new methods of delivery of oral contraceptives that reduce or eliminate the hormone-free interval in order to improve convenience and acceptability, but maintain contraceptive efficacy, minimize side effects and reduce or eliminate the frequency of withdrawal bleeding episodes. Studies in several countries, including the US, have documented that many women would prefer to have no episodes of withdrawal bleeding when using oral contraceptives. This review focuses on a unique oral contraceptive formulation containing levonorgestrel 90 microg and ethinyl estradiol 20 microg, approved for use in a continuous dosing regimen designed to eliminate withdrawal bleeding throughout the entire year.

  17. A scalable and accurate method for classifying protein-ligand binding geometries using a MapReduce approach.

    PubMed

    Estrada, T; Zhang, B; Cicotti, P; Armen, R S; Taufer, M

    2012-07-01

    We present a scalable and accurate method for classifying protein-ligand binding geometries in molecular docking. Our method is a three-step process: the first step encodes the geometry of a three-dimensional (3D) ligand conformation into a single 3D point in the space; the second step builds an octree by assigning an octant identifier to every single point in the space under consideration; and the third step performs an octree-based clustering on the reduced conformation space and identifies the most dense octant. We adapt our method for MapReduce and implement it in Hadoop. The load-balancing, fault-tolerance, and scalability in MapReduce allow screening of very large conformation spaces not approachable with traditional clustering methods. We analyze results for docking trials for 23 protein-ligand complexes for HIV protease, 21 protein-ligand complexes for Trypsin, and 12 protein-ligand complexes for P38alpha kinase. We also analyze cross docking trials for 24 ligands, each docking into 24 protein conformations of the HIV protease, and receptor ensemble docking trials for 24 ligands, each docking in a pool of HIV protease receptors. Our method demonstrates significant improvement over energy-only scoring for the accurate identification of native ligand geometries in all these docking assessments. The advantages of our clustering approach make it attractive for complex applications in real-world drug design efforts. We demonstrate that our method is particularly useful for clustering docking results using a minimal ensemble of representative protein conformational states (receptor ensemble docking), which is now a common strategy to address protein flexibility in molecular docking. Copyright © 2012 Elsevier Ltd. All rights reserved.
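
    The second and third steps described above, assigning octant identifiers and finding the densest octant, translate naturally into a small in-memory sketch. The code below is illustrative only (it is not the Hadoop/MapReduce implementation, and it assumes each docking pose has already been encoded as a single 3-D point): it interleaves per-axis cell indices into an octree leaf identifier at a chosen depth and reports the most densely populated leaf.

```python
import numpy as np
from collections import Counter

def octant_id(points, lower, upper, depth):
    """Assign each 3-D point the identifier of the octree leaf (at the given
    depth) that contains it, by interleaving per-axis cell indices."""
    cells = ((points - lower) / (upper - lower) * (2 ** depth)).astype(int)
    cells = np.clip(cells, 0, 2 ** depth - 1)
    ids = np.zeros(len(points), dtype=np.int64)
    for level in range(depth):
        for axis in range(3):
            ids = (ids << 1) | ((cells[:, axis] >> (depth - 1 - level)) & 1)
    return ids

rng = np.random.default_rng(0)
# Hypothetical reduced conformation space: each docking pose is one 3-D point;
# a cluster of near-native poses sits around (0.2, 0.2, 0.2).
poses = np.vstack([rng.uniform(0, 1, (300, 3)),
                   rng.normal(0.2, 0.02, (200, 3))])
ids = octant_id(poses, lower=np.zeros(3), upper=np.ones(3), depth=4)
densest_id, count = Counter(ids).most_common(1)[0]
print("densest octant id:", densest_id, "contains", count, "poses")
print("representative pose:", poses[ids == densest_id].mean(axis=0))
```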

  18. Test methods for estimating the efficacy of the fast-acting disinfectant peracetic acid on surfaces of personal protective equipment.

    PubMed

    Lemmer, K; Howaldt, S; Heinrich, R; Roder, A; Pauli, G; Dorner, B G; Pauly, D; Mielke, M; Schwebke, I; Grunow, R

    2017-11-01

    The work aimed at developing and evaluating practically relevant methods for testing of disinfectants on contaminated personal protective equipment (PPE). Carriers were prepared from PPE fabrics and contaminated with Bacillus subtilis spores. Peracetic acid (PAA) was applied as a suitable disinfectant. In method 1, the contaminated carrier was submerged in PAA solution; in method 2, the contaminated area was covered with PAA; and in method 3, PAA, preferentially combined with a surfactant, was dispersed as a thin layer. In each method, 0·5-1% PAA reduced the viability of spores by a factor of ≥6 log10 within 3 min. The technique of the most realistic method 3 proved to be effective at low temperatures and also with a high organic load. Vaccinia virus and Adenovirus were inactivated with 0·05-0·1% PAA by up to ≥6 log10 within 1 min. The cytotoxicity of ricin was considerably reduced by 2% PAA within 15 min of exposure. The PAA/detergent mixture made it possible to cover hydrophobic PPE surfaces with a thin yet effective disinfectant layer. The test methods are objective tools for estimating the biocidal efficacy of disinfectants on hydrophobic flexible surfaces. © 2017 The Society for Applied Microbiology.

  19. Qualitative Research for Patient Safety Using ICTs: Methodological Considerations in the Technological Age.

    PubMed

    Yee, Kwang Chien; Wong, Ming Chao; Turner, Paul

    2017-01-01

    Considerable effort and resources have been dedicated to improving the quality and safety of patient care through health information systems, but there is still significant scope for improvement. One contributing factor to the lack of progress in patient safety improvement, especially where technology has been deployed, relates to an over-reliance on purely objective, quantitative, positivist research paradigms as the basis for generating and validating evidence of improvement. This paper argues the need for greater recognition and accommodation of evidence of improvement generated through more subjective, qualitative and pragmatic research paradigms to aid patient safety, especially where technology is deployed. This paper discusses how acknowledging the role and value of more subjective ontologies and pragmatist epistemologies can support improvement science research. This paper illustrates some challenges and benefits of adopting qualitative research methods in patient safety improvement projects, particularly focusing on challenges in the technological era. While adopting methods that can more readily capture, analyse and interpret direct user experiences, attitudes, insights and behaviours in their contextual settings can enhance patient safety 'on the ground' and reduce and/or mitigate errors, the challenges of using these methods with younger, "technologically-centred" healthcare professionals and patients need to be recognised.

  20. 78 FR 52854 - Use of Differential Income Stream as an Application of the Income Method and as a Consideration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... Differential Income Stream as an Application of the Income Method and as a Consideration in Assessing the Best Method AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Final regulations and removal of... differential income stream as a consideration in assessing the best method in connection with a cost sharing...

  1. 76 FR 80309 - Use of Differential Income Stream as an Application of the Income Method and as a Consideration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-23

    ... Use of Differential Income Stream as an Application of the Income Method and as a Consideration in Assessing the Best Method AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Notice of proposed... guidance on how an analysis of the differential income stream may provide a best method consideration for...

  2. A New Method for Incremental Testing of Finite State Machines

    NASA Technical Reports Server (NTRS)

    Pedrosa, Lehilton Lelis Chaves; Moura, Arnaldo Vieira

    2010-01-01

    The automatic generation of test cases is an important issue for conformance testing of several critical systems. We present a new method for the derivation of test suites when the specification is modeled as a combined Finite State Machine (FSM). A combined FSM is obtained conjoining previously tested submachines with newly added states. This new concept is used to describe a fault model suitable for incremental testing of new systems, or for retesting modified implementations. For this fault model, only the newly added or modified states need to be tested, thereby considerably reducing the size of the test suites. The new method is a generalization of the well-known W-method and the G-method, but is scalable, and so it can be used to test FSMs with an arbitrarily large number of states.

  3. A Method of Dynamic Extended Reactive Power Optimization in Distribution Network Containing Photovoltaic-Storage System

    NASA Astrophysics Data System (ADS)

    Wang, Wu; Huang, Wei; Zhang, Yongjun

    2018-03-01

    The grid integration of photovoltaic-storage systems introduces uncertain factors into the network. In order to make full use of the adjusting ability of the Photovoltaic-Storage System (PSS), this paper puts forward a reactive power optimization model whose objective function is based on power loss and device adjusting cost, including the energy storage adjusting cost. The optimization problem is solved with a Cataclysmic Genetic Algorithm and compared with other optimization methods. The results show that the proposed method of dynamic extended reactive power optimization enhances the effect of reactive power optimization, reducing both power loss and device adjusting cost, while also giving consideration to voltage safety.

  4. Second Generation International Space Station (ISS) Total Organic Carbon Analyzer (TOCA) Verification Testing and On-Orbit Performance Results

    NASA Technical Reports Server (NTRS)

    Bentley, Nicole L.; Thomas, Evan A.; VanWie, Michael; Morrison, Chad; Stinson, Richard G.

    2010-01-01

    The Total Organic Carbon Analyzer (TOCA) is designed to autonomously determine recovered water quality as a function of TOC. The current TOCA has been on the International Space Station since November 2008. Functional checkout and operations revealed complex operating considerations. Specifically, failure of the hydrogen catalyst resulted in the development of an innovative oxidation analysis method. This method reduces the activation time and limits the hydrogen produced during analysis, while retaining the ability to indicate TOC concentrations within 25% accuracy. Subsequent testing and comparison to archived samples returned from the Station and tested on the ground yield high confidence in this method, and in the quality of the recovered water.

  5. Influence of conventional and ultrasonic-assisted extraction on phenolic contents, betacyanin contents, and antioxidant capacity of red dragon fruit (Hylocereus polyrhizus).

    PubMed

    Ramli, Nurul Shazini; Ismail, Patimah; Rahmat, Asmah

    2014-01-01

    The aim of this study was to examine the effects of extraction methods on antioxidant capacities of red dragon fruit peel and flesh. Antioxidant capacities were measured using the 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS) radical cation assay and the ferric reducing antioxidant power (FRAP) assay. Total phenolic content (TPC) was determined using Folin-Ciocalteu reagent, while quantitative determination of total flavonoid content (TFC) was conducted using the aluminium trichloride colorimetric method. Betacyanin content (BC) was measured by spectrophotometer. Red dragon fruit was extracted using conventional (CV) and ultrasonic-assisted extraction (UE) techniques to determine the most efficient way of extracting its antioxidant components. Results indicated that UE increased TFC, reduced the extraction yield, BC, and TPC, but exhibited the strongest scavenging activity for the peel of red dragon fruit. In contrast, UE reduced BC, TFC, and scavenging activity but increased the yield for the flesh. Nonetheless, UE slightly increased TPC in flesh. Scavenging activity and reducing power were highly correlated with phenolic and flavonoid compounds. Conversely, the scavenging activity and reducing power were weakly correlated with betacyanin content. This work gives scientific evidence for consideration of the type of extraction technique for the peel and flesh of red dragon fruit in applied research and the food industry.

  6. Optimal selection of biochars for remediating metals ...

    EPA Pesticide Factsheets

    Approximately 500,000 abandoned mines across the U.S. pose a considerable, pervasive risk to human health and the environment due to possible exposure to the residuals of heavy metal extraction. Historically, a variety of chemical and biological methods have been used to reduce the bioavailability of the metals at mine sites. Biochar with its potential to complex and immobilize heavy metals, is an emerging alternative for reducing bioavailability. Furthermore, biochar has been reported to improve soil conditions for plant growth and can be used for promoting the establishment of a soil-stabilizing native plant community to reduce offsite movement of metal-laden waste materials. Because biochar properties depend upon feedstock selection, pyrolysis production conditions, and activation procedures used, they can be designed to meet specific remediation needs. As a result biochar with specific properties can be produced to correspond to specific soil remediation situations. However, techniques are needed to optimally match biochar characteristics with metals contaminated soils to effectively reduce metal bioavailability. Here we present experimental results used to develop a generalized method for evaluating the ability of biochar to reduce metals in mine spoil soil from an abandoned Cu and Zn mine. Thirty-eight biochars were produced from approximately 20 different feedstocks and produced via slow pyrolysis or gasification, and were allowed to react with a f

  7. Influence of Conventional and Ultrasonic-Assisted Extraction on Phenolic Contents, Betacyanin Contents, and Antioxidant Capacity of Red Dragon Fruit (Hylocereus polyrhizus)

    PubMed Central

    Ramli, Nurul Shazini; Ismail, Patimah; Rahmat, Asmah

    2014-01-01

    The aim of this study was to examine the effects of extraction methods on antioxidant capacities of red dragon fruit peel and flesh. Antioxidant capacities were measured using the 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS) radical cation assay and the ferric reducing antioxidant power (FRAP) assay. Total phenolic content (TPC) was determined using Folin-Ciocalteu reagent, while quantitative determination of total flavonoid content (TFC) was conducted using the aluminium trichloride colorimetric method. Betacyanin content (BC) was measured by spectrophotometer. Red dragon fruit was extracted using conventional (CV) and ultrasonic-assisted extraction (UE) techniques to determine the most efficient way of extracting its antioxidant components. Results indicated that UE increased TFC, reduced the extraction yield, BC, and TPC, but exhibited the strongest scavenging activity for the peel of red dragon fruit. In contrast, UE reduced BC, TFC, and scavenging activity but increased the yield for the flesh. Nonetheless, UE slightly increased TPC in flesh. Scavenging activity and reducing power were highly correlated with phenolic and flavonoid compounds. Conversely, the scavenging activity and reducing power were weakly correlated with betacyanin content. This work gives scientific evidence for consideration of the type of extraction technique for the peel and flesh of red dragon fruit in applied research and the food industry. PMID:25379555

  8. Explicit solutions of a gravity-induced film flow along a convectively heated vertical wall.

    PubMed

    Raees, Ammarah; Xu, Hang

    2013-01-01

    The gravity-driven film flow has been analyzed along a vertical wall subjected to a convective boundary condition. The Boussinesq approximation is applied to simplify the buoyancy term, and similarity transformations are used on the mathematical model of the problem under consideration to obtain a set of coupled ordinary differential equations. The reduced equations are then solved explicitly by using the homotopy analysis method (HAM). The resulting solutions are investigated for heat transfer effects on the velocity and temperature profiles.

  9. Overview of Considerations in Assessing the Biomass Potential of Army Installations.

    DTIC Science & Technology

    1981-08-01

    ... stage. Will the species grow well in poor soils and on harsh, open sites? Trees that met these standards were then grouped according to their ... frequency of fire, (2) reduces the need for fire control methods such as controlled burns, and (3) makes site preparation easier. Whole-tree chipping ... the "aesthetic" value of the stand is increased. The negative effect most often thought to occur with whole-tree chipping is loss of soil nutrients

  10. Engine dynamic analysis with general nonlinear finite element codes

    NASA Technical Reports Server (NTRS)

    Adams, M. L.; Padovan, J.; Fertis, D. G.

    1991-01-01

    A general engine dynamic analysis as a standard design study computational tool is described for the prediction and understanding of complex engine dynamic behavior. Improved definition of engine dynamic response provides valuable information and insights leading to reduced maintenance and overhaul costs on existing engine configurations. Application of advanced engine dynamic simulation methods provides a considerable cost reduction in the development of new engine designs by eliminating some of the trial and error process done with engine hardware development.

  11. In Silico Genome Mismatch Scanning to Map Breast Cancer Genes in Extended Pedigrees

    DTIC Science & Technology

    2008-07-01

    Methods: IBD sharing in pedigrees. There is considerable literature ... is sufficient to maintain interest in the region. ... for observed IBS instead of IBD, and for sporadic cases reducing the number of meioses, pedigrees with meiosis count d in the 25 to 30 range are

  12. Detectability limit and uncertainty considerations for laser induced fluorescence spectroscopy in flames

    NASA Technical Reports Server (NTRS)

    Daily, J. W.

    1978-01-01

    Laser induced fluorescence spectroscopy of flames is discussed, and derived uncertainty relations are used to calculate detectability limits due to statistical errors. Interferences due to Rayleigh scattering from molecules as well as Mie scattering and incandescence from particles have been examined for their effect on detectability limits. Fluorescence trapping is studied, and some methods for reducing the effect are considered. Fluorescence trapping places an upper limit on the number density of the fluorescing species that can be measured without signal loss.

  13. On the matter of building high-frequency amplifiers minimally influenced by interstage stray reactances

    NASA Astrophysics Data System (ADS)

    Volkov, Y. A.

    2017-01-01

    The expediency of building wideband multistage amplifiers whose stages are interconnected so that "impedance mismatch modes" are realized is justified. These modes make it possible to considerably reduce the sensitivity of the amplifier transfer factors to the stray (constructional) capacitances and inductances of the interstage circuits. A procedure for synthesizing the schematics of such amplifiers is proposed; its efficiency and clarity are provided by the use of the signal graph method.

  14. Cubic spline numerical solution of an ablation problem with convective backface cooling

    NASA Astrophysics Data System (ADS)

    Lin, S.; Wang, P.; Kahawita, R.

    1984-08-01

    An implicit numerical technique using cubic splines is presented for solving an ablation problem on a thin wall with convective cooling. A non-uniform computational mesh with 6 grid points has been used for the numerical integration. The method has been found to be computationally efficient, providing, for the case under consideration, an overall error of about 1 percent. The results obtained indicate that the convective cooling is an important factor in reducing the ablation thickness.

  15. Thermal Considerations for Reducing the Cooldown and Warmup Duration of the James Webb Space Telescope OTIS Cryo-Vacuum Test

    NASA Technical Reports Server (NTRS)

    Yang, Kan; Glazer, Stuart; Ousley, Wes; Burt, William

    2017-01-01

    The James Webb Space Telescope (JWST), set to launch in 2018, is NASA's next-generation flagship telescope. The Optical Telescope Element (OTE) and Integrated Science Instrument Module (ISIM) contain all of the optical surfaces and instruments to capture and analyze the telescope's infrared targets. The integrated OTE and ISIM are denoted as OTIS, and will be tested as a single unit in a critical thermal-vacuum test in mid-2017 at NASA Johnson Space Center's Chamber A facility. The payload will be evaluated for workmanship and functionality in a 20K simulated flight environment during this thermal-vacuum test. However, the sheer thermal mass of the OTIS payload as well as the restrictive gradient, rate, and contamination-related constraints placed on test components precludes rapid cooldown or warmup to its steady-state cryo-balance condition. Hardware safety considerations preclude injection of helium gas for free molecular heat transfer. Initial thermal analysis predicted that transient radiative cooldown from ambient temperatures, while meeting all limits and constraints, would take 33.3 days; warmup similarly would take 28.4 days. This paper discusses methods used to reduce transition times from the original predictions through modulation of boundary temperatures and environmental conditions. By optimizing helium shroud transition rates and heater usage, as well as rigorously re-examining previously imposed constraints, savings of up to three days on cooldown and up to a week on warmup can be achieved. The efficiencies gained through these methods allow the JWST thermal test team to create faster cooldown and warmup profiles, thus reducing the overall test duration and cost, while keeping all of the required test operations.

  16. Bio-diesel production directly from the microalgae biomass of Nannochloropsis by microwave and ultrasound radiation.

    PubMed

    Koberg, Miri; Cohen, Moshe; Ben-Amotz, Ami; Gedanken, Aharon

    2011-03-01

    This work offers an optimized method for the direct conversion of harvested Nannochloropsis algae into bio-diesel using two novel techniques. The first is a unique bio-technology-based environmental system utilizing flue gas from coal burning power stations for microalgae cultivation. This method considerably reduces the cost of algae production. The second technique is the direct transesterification (a one-stage method) of the Nannochloropsis biomass for bio-diesel production using microwave and ultrasound radiation with the aid of a SrO catalyst. These two techniques were tested and compared to identify the most effective bio-diesel production method. Based on our results, it is concluded that the microwave oven method appears to be the simplest and most efficient method for the one-stage direct transesterification of the as-harvested Nannochloropsis algae. Copyright © 2010 Elsevier Ltd. All rights reserved.

  17. Improved look-up table method of computer-generated holograms.

    PubMed

    Wei, Hui; Gong, Guanghong; Li, Ni

    2016-11-10

    Heavy computation load and vast memory requirements are major bottlenecks of computer-generated holograms (CGHs), which are promising and challenging in three-dimensional displays. To solve these problems, an improved look-up table (LUT) method suitable for arbitrarily sampled object points is proposed and implemented on a graphics processing unit (GPU); its reconstructed object quality is consistent with that of the coherent ray-trace (CRT) method. The concept of a distance factor is defined, and the distance factors are pre-computed off-line and stored in a look-up table. The results show that while reconstruction quality close to that of the CRT method is obtained, the on-line computation time is dramatically reduced compared with the LUT method on the GPU, and the memory usage is considerably lower than that of the novel-LUT method. Optical experiments are carried out to validate the effectiveness of the proposed method.
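
    The general look-up-table principle behind such methods is that fringe contributions depend only on the offset between a hologram pixel and an object point, so they can be tabulated once off-line and merely looked up on-line. The snippet below illustrates that basic idea for points lying on a single depth plane with invented parameters; it is a generic point-oriented LUT sketch, not the paper's improved distance-factor scheme.

```python
import numpy as np

# Minimal illustration of the LUT idea for computer-generated holograms.
wavelength = 532e-9          # m (assumed)
pitch = 8e-6                 # hologram pixel pitch in m (assumed)
N = 256                      # hologram is N x N pixels
z = 0.1                      # depth of the object plane in m (assumed)

# Off-line step: tabulate the cosine fringe for every absolute pixel offset.
offsets = np.arange(N) * pitch
dx, dy = np.meshgrid(offsets, offsets)
table = np.cos(2 * np.pi / wavelength * np.sqrt(dx**2 + dy**2 + z**2))

def add_point(hologram, px, py, amplitude):
    """On-line step: accumulate one object point by looking up fringe values
    instead of evaluating cos/sqrt for every hologram pixel."""
    yy, xx = np.indices(hologram.shape)
    hologram += amplitude * table[np.abs(yy - py), np.abs(xx - px)]

hologram = np.zeros((N, N))
for px, py, amp in [(64, 64, 1.0), (180, 100, 0.7)]:   # arbitrary sample points
    add_point(hologram, px, py, amp)
print("fringe pattern range:", hologram.min(), hologram.max())
```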

  18. 24 CFR 1.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., and the term State means any one of the foregoing. (e) The term Federal financial assistance includes... without consideration or at a nominal consideration, or at a consideration which is reduced for the... of its purposes the provision of assistance. The term Federal financial assistance does not include a...

  19. 32 CFR 195.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... term Federal financial assistance includes: (1) Grants and loans of Federal funds, (2) The grant or... or any interest in such property without consideration or at a nominal consideration, or at a consideration which is reduced for the purpose of assisting the recipient, or in recognition of the public...

  20. Cupping - is it reproducible? Experiments about factors determining the vacuum.

    PubMed

    Huber, R; Emerich, M; Braeunig, M

    2011-04-01

    Cupping is a traditional method for treating pain which is investigated nowadays in clinical studies. Because the methods for producing the vacuum vary considerably we tested their reproducibility. In a first set of experiments (study 1) four methods for producing the vacuum (lighter flame 2 cm (LF1), lighter flame 4 cm (LF2), alcohol flame (AF) and mechanical suction with a balloon (BA)) have been compared in 50 trials each. The cupping glass was prepared with an outlet and stop-cock, the vacuum was measured with a pressure-gauge after the cup was set to a soft rubber pad. In a second series of experiments (study 2) we investigated the stability of pressures in 20 consecutive trials in two experienced cupping practitioners and ten beginners using method AF. In study 1 all four methods yielded consistent pressures. Large differences in magnitude were, however, observed between methods (mean pressures -200±30 hPa with LF1, -310±30 hPa with LF2, -560±30 hPa with AF, and -270±16 hPa with BA). With method BA the standard deviation was reduced by a factor 2 compared to the flame methods. In study 2 beginners had considerably more difficulty obtaining a stable pressure yield than advanced cupping practitioners, showing a distinct learning curve before reaching expertise levels after about 10-20 trials. Cupping is reproducible if the exact method is described in detail. Mechanical suction with a balloon has the best reproducibility. Beginners need at least 10-20 trials to produce stable pressures. Copyright © 2010 Elsevier Ltd. All rights reserved.

  1. Shape Measurement by Means of Phase Retrieval using a Spatial Light Modulator

    NASA Astrophysics Data System (ADS)

    Agour, Mostafa; Huke, Philipp; Kopylow, Christoph V.; Falldorf, Claas

    2010-04-01

    We present a novel approach to investigate the shape of a diffusely reflecting technical object. It is based on a combination of a multiple-illumination contouring procedure and phase retrieval from a set of intensity measurements. Special consideration is given to the design of the experimental configuration for phase retrieval and the iterative algorithm to extract the 3D phase map. It is mainly based on a phase-only spatial light modulator (SLM) in the Fourier domain of a 4f-imaging system. The SLM is used to modulate the light incident in the Fourier plane with the transfer function of propagation. Thus, a set of consecutive intensity measurements of the wave field scattered by the investigated object in various propagation states can be realized in a common recording plane. In contrast to already existing methods, no mechanical adjustment is required during the recording process and thus the measuring time is considerably reduced. The method is applied to investigate the shape of micro-objects obtained from a metalforming process. Finally, the experimental results are compared to those provided by a standard interferometric contouring procedure.

  2. An adaptive radiotherapy planning strategy for bladder cancer using deformation vector fields.

    PubMed

    Vestergaard, Anne; Kallehauge, Jesper Folsted; Petersen, Jørgen Breede Baltzer; Høyer, Morten; Søndergaard, Jimmi; Muren, Ludvig Paul

    2014-09-01

    Adaptive radiotherapy (ART) has considerable potential in treatment of bladder cancer due to large inter-fractional changes in shape and size of the target. The aim of this study was to compare our clinically applied method for plan library creation that involves manual bladder delineations (Clin-ART) with a method using the deformation vector fields (DVFs) resulting from intensity-based deformable image registrations (DVF-based ART). The study included thirteen patients with urinary bladder cancer who had daily cone beam CTs (CBCTs) acquired for set-up. In both ART strategies investigated, three plan selection volumes were generated using the CBCTs from the first four fractions; in Clin-ART boolean combinations of delineated bladders were used, while the DVF-based strategy applied combinations of the mean and standard deviation of patient-specific DVFs. The volume ratios (VRs) of the course-averaged PTV for the two ART strategies relative the non-adaptive PTV were calculated. Both Clin-ART and DVF-based ART considerably reduced the course-averaged PTV, compared to non-adaptive RT. The VR for DVF-based ART was lower than for Clin-ART (0.65 vs. 0.73; p<0.01). DVF-based ART for bladder irradiation has a considerable normal tissue sparing potential surpassing our already highly conformal clinically applied ART strategy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. Technical and conceptual considerations for using animated stimuli in studies of animal behavior.

    PubMed

    Chouinard-Thuly, Laura; Gierszewski, Stefanie; Rosenthal, Gil G; Reader, Simon M; Rieucau, Guillaume; Woo, Kevin L; Gerlai, Robert; Tedore, Cynthia; Ingley, Spencer J; Stowers, John R; Frommen, Joachim G; Dolins, Francine L; Witte, Klaudia

    2017-02-01

    Rapid technical advances in the field of computer animation (CA) and virtual reality (VR) have opened new avenues in animal behavior research. Animated stimuli are powerful tools as they offer standardization, repeatability, and complete control over the stimulus presented, thereby "reducing" and "replacing" the animals used, and "refining" the experimental design in line with the 3Rs. However, appropriate use of these technologies raises conceptual and technical questions. In this review, we offer guidelines for common technical and conceptual considerations related to the use of animated stimuli in animal behavior research. Following the steps required to create an animated stimulus, we discuss (I) the creation, (II) the presentation, and (III) the validation of CAs and VRs. Although our review is geared toward computer-graphically designed stimuli, considerations on presentation and validation also apply to video playbacks. CA and VR allow both new behavioral questions to be addressed and existing questions to be addressed in new ways, thus we expect a rich future for these methods in both ultimate and proximate studies of animal behavior.

  4. Reduction of bias and variance for evaluation of computer-aided diagnostic schemes.

    PubMed

    Li, Qiang; Doi, Kunio

    2006-04-01

    Computer-aided diagnostic (CAD) schemes have been developed to assist radiologists in detecting various lesions in medical images. In addition to the development, an equally important problem is the reliable evaluation of the performance levels of various CAD schemes. It is good to see that more and more investigators are employing more reliable evaluation methods such as leave-one-out and cross validation, instead of less reliable methods such as resubstitution, for assessing their CAD schemes. However, the common applications of leave-one-out and cross-validation evaluation methods do not necessarily imply that the estimated performance levels are accurate and precise. Pitfalls often occur in the use of leave-one-out and cross-validation evaluation methods, and they lead to unreliable estimation of performance levels. In this study, we first identified a number of typical pitfalls for the evaluation of CAD schemes, and conducted a Monte Carlo simulation experiment for each of the pitfalls to demonstrate quantitatively the extent of bias and/or variance caused by the pitfall. Our experimental results indicate that considerable bias and variance may exist in the estimated performance levels of CAD schemes if one employs various flawed leave-one-out and cross-validation evaluation methods. In addition, for promoting and utilizing a high standard for reliable evaluation of CAD schemes, we attempt to make recommendations, whenever possible, for overcoming these pitfalls. We believe that, with the recommended evaluation methods, we can considerably reduce the bias and variance in the estimated performance levels of CAD schemes.
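
    The scale of the bias caused by one classic pitfall, evaluating on the training cases (resubstitution), is easy to reproduce. The sketch below is a generic illustration rather than one of the paper's simulation experiments: the features carry no class information, so any estimate far above chance is pure optimism bias.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Features carry no class information, so the true performance is chance (0.5).
X = rng.standard_normal((100, 50))
y = rng.integers(0, 2, 100)

clf = KNeighborsClassifier(n_neighbors=1)

# Pitfall: resubstitution (training and testing on the same cases).
resub = clf.fit(X, y).score(X, y)

# More reliable: cross-validation with held-out cases.
cv = cross_val_score(clf, X, y, cv=5).mean()

print(f"resubstitution accuracy: {resub:.2f}")   # 1.00, heavily biased
print(f"cross-validated accuracy: {cv:.2f}")     # close to chance
```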

  5. Life cycle tools combined with multi-criteria and participatory methods for agricultural sustainability: Insights from a systematic and critical review.

    PubMed

    De Luca, Anna Irene; Iofrida, Nathalie; Leskinen, Pekka; Stillitano, Teodora; Falcone, Giacomo; Strano, Alfio; Gulisano, Giovanni

    2017-10-01

    Life cycle (LC) methodologies have attracted a great interest in agricultural sustainability assessments, even if, at the same time, they have sometimes been criticized for making unrealistic assumptions and subjective choices. To cope with these weaknesses, Multi-Criteria Decision Analysis (MCDA) and/or participatory methods can be used to balance and integrate different sustainability dimensions. The purpose of this study is to highlight how life cycle approaches were combined with MCDA and participatory methods to address agricultural sustainability in the published scientific literature. A systematic and critical review was developed, highlighting the following features: which multi-criterial and/or participatory methods have been associated with LC tools; how they have been integrated or complemented (methodological relationships); the intensity of the involvement of stakeholders (degree of participation); and which synergies have been achieved by combining the methods. The main typology of integration was represented by multi-criterial frameworks integrating LC evaluations. LC tools can provide MCDA studies with local and global information on how to reduce negative impacts and avoid burden shifts, while MCDA methods can help LC practitioners deal with subjective assumptions in an objective way, to take into consideration actors' values and to overcome trade-offs among the different dimensions of sustainability. Considerations concerning the further development of Life Cycle Sustainability Assessment (LCSA) have been identified as well. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Special Considerations for Mass Violence Events in Senior Living Facilities: A Case Report on the Pinelake Health and Rehab Center Shooting.

    PubMed

    Martin, Cody; Powell, David

    2017-02-01

    The 2009 Pinelake Health and Rehab Center shooting in Carthage, North Carolina, presents a unique case study for examining the specific considerations for mass violence events in senior living facilities. A variety of factors, including reduced sensory perception, reduced mobility, and cognitive decline, may increase the vulnerability of the populations of senior living facilities during mass violence events. Management of response aspects such as evacuation, relocation, and reunification also require special consideration in the context of mass violence at senior living facilities. Better awareness of these vulnerabilities and response considerations can assist facility administrators and emergency managers when preparing for potential mass violence events at senior living facilities. (Disaster Med Public Health Preparedness. 2017;11:150-152).

  7. Current and future contraceptive options for women living with HIV

    PubMed Central

    Patel, Rena C.; Bukusi, Elizabeth A.; Baeten, Jared M.

    2018-01-01

    Introduction Among women living with HIV, half of the pregnancies are unintended. Effective contraception can prevent unintended pregnancies and consequently reduce maternal mortality and perinatal transmission of HIV. While contraceptive options available for all women also apply to women living with HIV, specific considerations exist to the use of contraception by women living with HIV. Areas covered First, general principles guiding the use of contraception among women living with HIV are discussed, such as choice, method mix, relative effectiveness, and drug-drug interactions. Second, a detailed discussion of each contraceptive method and issues surrounding the use of that method, such as drug-drug interactions, follows. Third, future contraceptive options in advanced development for use by women or men are briefly discussed. Expert opinion Contraceptive methods available to all women should also be accessible to women living with HIV. When the relative effectiveness of a contraceptive method is reduced, for example due to drug-drug interactions with antiretrovirals, the method should still be made available to women living with HIV with the appropriate information sharing and counseling. Greater research on various aspects of contraceptive use by women living with HIV and more comprehensive testing of co-administration of hormonal contraceptives and common medications used by these women are warranted. PMID:28891343

  8. Current and future contraceptive options for women living with HIV.

    PubMed

    Patel, Rena C; Bukusi, Elizabeth A; Baeten, Jared M

    2018-01-01

    Among women living with HIV, half of the pregnancies are unintended. Effective contraception can prevent unintended pregnancies and consequently reduce maternal mortality and perinatal transmission of HIV. While contraceptive options available for all women also apply to women living with HIV, specific considerations exist to the use of contraception by women living with HIV. Areas covered: First, general principles guiding the use of contraception among women living with HIV are discussed, such as choice, method mix, relative effectiveness, and drug-drug interactions. Second, a detailed discussion of each contraceptive method and issues surrounding the use of that method, such as drug-drug interactions, follows. Third, future contraceptive options in advanced development for use by women or men are briefly discussed. Expert opinion: Contraceptive methods available to all women should also be accessible to women living with HIV. When the relative effectiveness of a contraceptive method is reduced, for example due to drug-drug interactions with antiretrovirals, the method should still be made available to women living with HIV with the appropriate information sharing and counseling. Greater research on various aspects of contraceptive use by women living with HIV and more comprehensive testing of co-administration of hormonal contraceptives and common medications used by these women are warranted.

  9. Optimal subsystem approach to multi-qubit quantum state discrimination and experimental investigation

    NASA Astrophysics Data System (ADS)

    Xue, ShiChuan; Wu, JunJie; Xu, Ping; Yang, XueJun

    2018-02-01

    Quantum computing offers computational capabilities beyond those of classical computing because of superposition. Distinguishing several quantum states from quantum algorithm outputs is often a vital computational task. In most cases the quantum states are non-orthogonal due to superposition, and quantum mechanics shows that they cannot be discriminated perfectly by measurement, so repeated measurements are required. Hence, it is important to determine the optimum measuring method, one that requires fewer repetitions and a lower error rate. However, extending current measurement approaches, which mainly target quantum cryptography, to multi-qubit situations for quantum computing confronts challenges, such as global operations that carry considerable costs in the experimental realm. Therefore, in this study, we propose an optimum subsystem method to avoid these difficulties. We provide a comparison between the reduced subsystem method and the global minimum-error method for two-qubit problems; the conclusions have been verified experimentally. The results showed that the subsystem method could effectively discriminate non-orthogonal two-qubit states, such as separable states, entangled pure states, and mixed states; the cost of the experimental process was significantly reduced, in most circumstances, with an acceptable error rate. We believe the optimal subsystem method is the most valuable and promising approach for multi-qubit quantum computing applications.

  10. LEGO: a novel method for gene set over-representation analysis by incorporating network-based gene weights

    PubMed Central

    Dong, Xinran; Hao, Yun; Wang, Xiao; Tian, Weidong

    2016-01-01

    Pathway or gene set over-representation analysis (ORA) has become a routine task in functional genomics studies. However, currently widely used ORA tools employ statistical methods such as Fisher’s exact test that reduce a pathway into a list of genes, ignoring the constitutive functional non-equivalent roles of genes and the complex gene-gene interactions. Here, we develop a novel method named LEGO (functional Link Enrichment of Gene Ontology or gene sets) that takes into consideration these two types of information by incorporating network-based gene weights in ORA analysis. In three benchmarks, LEGO achieves better performance than Fisher and three other network-based methods. To further evaluate LEGO’s usefulness, we compare LEGO with five gene expression-based and three pathway topology-based methods using a benchmark of 34 disease gene expression datasets compiled by a recent publication, and show that LEGO is among the top-ranked methods in terms of both sensitivity and prioritization for detecting target KEGG pathways. In addition, we develop a cluster-and-filter approach to reduce the redundancy among the enriched gene sets, making the results more interpretable to biologists. Finally, we apply LEGO to two lists of autism genes, and identify relevant gene sets to autism that could not be found by Fisher. PMID:26750448

  11. LEGO: a novel method for gene set over-representation analysis by incorporating network-based gene weights.

    PubMed

    Dong, Xinran; Hao, Yun; Wang, Xiao; Tian, Weidong

    2016-01-11

    Pathway or gene set over-representation analysis (ORA) has become a routine task in functional genomics studies. However, currently widely used ORA tools employ statistical methods such as Fisher's exact test that reduce a pathway into a list of genes, ignoring the constitutive functional non-equivalent roles of genes and the complex gene-gene interactions. Here, we develop a novel method named LEGO (functional Link Enrichment of Gene Ontology or gene sets) that takes into consideration these two types of information by incorporating network-based gene weights in ORA analysis. In three benchmarks, LEGO achieves better performance than Fisher and three other network-based methods. To further evaluate LEGO's usefulness, we compare LEGO with five gene expression-based and three pathway topology-based methods using a benchmark of 34 disease gene expression datasets compiled by a recent publication, and show that LEGO is among the top-ranked methods in terms of both sensitivity and prioritization for detecting target KEGG pathways. In addition, we develop a cluster-and-filter approach to reduce the redundancy among the enriched gene sets, making the results more interpretable to biologists. Finally, we apply LEGO to two lists of autism genes, and identify relevant gene sets to autism that could not be found by Fisher.

  12. Model reduction for Space Station Freedom

    NASA Technical Reports Server (NTRS)

    Williams, Trevor

    1992-01-01

    Model reduction is an important practical problem in the control of flexible spacecraft, and a considerable amount of work has been carried out on this topic. Two of the best known methods developed are modal truncation and internal balancing. Modal truncation is simple to implement but can give poor results when the structure possesses clustered natural frequencies, as often occurs in practice. Balancing avoids this problem but has the disadvantages of high computational cost, possible numerical sensitivity problems, and no physical interpretation for the resulting balanced 'modes'. The purpose of this work is to examine the performance of the subsystem balancing technique developed by the investigator when tested on a realistic flexible space structure, in this case a model of the Permanently Manned Configuration (PMC) of Space Station Freedom. This method retains the desirable properties of standard balancing while overcoming the three difficulties listed above. It achieves this by first decomposing the structural model into subsystems of highly correlated modes. Each subsystem is approximately uncorrelated from all others, so balancing them separately and then combining yields comparable results to balancing the entire structure directly. The operation count reduction obtained by the new technique is considerable: a factor of roughly r^2 if the system decomposes into r equal subsystems. Numerical accuracy is also improved significantly, as the matrices being operated on are of reduced dimension, and the modes of the reduced-order model now have a clear physical interpretation; they are, to first order, linear combinations of repeated-frequency modes.
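
    As an illustration of the internal-balancing step discussed above, the following sketch implements standard square-root balanced truncation for a small state-space model with NumPy/SciPy. It is a generic textbook procedure, not the subsystem-balancing algorithm of the abstract; the function name, the demo system, and the use of SciPy's Lyapunov solver are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Reduce a stable LTI system (A, B, C) to order r by internal balancing."""
    # Controllability and observability Gramians: A Wc + Wc A^T + B B^T = 0, etc.
    Wc = solve_lyapunov(A, -B @ B.T)
    Wo = solve_lyapunov(A.T, -C.T @ C)
    # Square-root balancing: SVD of the product of Cholesky factors.
    Lc = cholesky(Wc, lower=True)
    Lo = cholesky(Wo, lower=True)
    U, s, Vt = svd(Lo.T @ Lc)
    # Balancing transformation and its inverse, restricted to the leading r states.
    T = Lc @ Vt.T[:, :r] / np.sqrt(s[:r])
    Tinv = (U[:, :r].T @ Lo.T) / np.sqrt(s[:r])[:, None]
    return Tinv @ A @ T, Tinv @ B, C @ T, s  # s are the Hankel singular values

# Hypothetical demo: reduce a random stable 10-state model to 4 states.
rng = np.random.default_rng(0)
n = 10
A = rng.normal(size=(n, n))
A = A - (np.max(np.linalg.eigvals(A).real) + 1.0) * np.eye(n)  # shift to make A stable
B = rng.normal(size=(n, 2))
C = rng.normal(size=(3, n))
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=4)
print(hsv)  # Hankel singular values indicate how much each state contributes
```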

  13. Time Series Expression Analyses Using RNA-seq: A Statistical Approach

    PubMed Central

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P.

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis. PMID:23586021
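
    The AR(1) approach mentioned above can be illustrated with a minimal least-squares fit of a lag-one autoregressive model to a single gene's time-course values. This sketch assumes log-transformed expression values and is not the authors' implementation; the example series is hypothetical.

```python
import numpy as np

def fit_ar1(x):
    """Fit x_t = c + phi * x_{t-1} + e_t by ordinary least squares.

    x: 1-D array of (log-transformed) expression values ordered in time.
    Returns the intercept c, the lag-one coefficient phi, and the residual variance.
    """
    x = np.asarray(x, dtype=float)
    y, xlag = x[1:], x[:-1]
    X = np.column_stack([np.ones_like(xlag), xlag])
    (c, phi), *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ np.array([c, phi])
    return c, phi, resid.var(ddof=2)

# Hypothetical example: expression of one gene across 8 time points.
series = np.log1p([12, 18, 35, 60, 84, 90, 88, 91])
print(fit_ar1(series))
```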

  14. Misgav-Ladach cesarean section: general consideration.

    PubMed

    Fatusić, Zlatan; Hudić, Igor; Musić, Asim

    2011-03-01

    Among obstetric techniques, cesarean section seemed to represent a well-defined procedure and significant advances in this intervention were considered to be unlikely. However, obstetric surgery has recently undergone many improvements. In 1972, Joel-Cohen presented a new method for transverse incision of the abdomen. This method, with some modifications, was integrated into the Misgav-Ladach cesarean section. The philosophy of this technique is to cause the least possible damage to tissues, to refrain from superfluous steps, and to make the intervention the simplest possible. Advantages of this method are lower incidence of fever and urinary tract infection, reduced use of antibiotics and narcotics, faster re-establishment of normal bowel function, shorter maternal hospital stay and less postoperative adhesion formation. The Misgav-Ladach method of cesarean section is suitable for emergency and elective procedures, justifying its use in daily routine.

  15. Time series expression analyses using RNA-seq: a statistical approach.

    PubMed

    Oh, Sunghee; Song, Seongho; Grabowski, Gregory; Zhao, Hongyu; Noonan, James P

    2013-01-01

    RNA-seq is becoming the de facto standard approach for transcriptome analysis with ever-reducing cost. It has considerable advantages over conventional technologies (microarrays) because it allows for direct identification and quantification of transcripts. Many time series RNA-seq datasets have been collected to study the dynamic regulations of transcripts. However, statistically rigorous and computationally efficient methods are needed to explore the time-dependent changes of gene expression in biological systems. These methods should explicitly account for the dependencies of expression patterns across time points. Here, we discuss several methods that can be applied to model timecourse RNA-seq data, including statistical evolutionary trajectory index (SETI), autoregressive time-lagged regression (AR(1)), and hidden Markov model (HMM) approaches. We use three real datasets and simulation studies to demonstrate the utility of these dynamic methods in temporal analysis.

  16. Higher-order time integration of Coulomb collisions in a plasma using Langevin equations

    DOE PAGES

    Dimits, A. M.; Cohen, B. I.; Caflisch, R. E.; ...

    2013-02-08

    The extension of Langevin-equation Monte-Carlo algorithms for Coulomb collisions from the conventional Euler-Maruyama time integration to the next higher order of accuracy, the Milstein scheme, has been developed, implemented, and tested. This extension proceeds via a formulation of the angular scattering directly as stochastic differential equations in the two fixed-frame spherical-coordinate velocity variables. Results from the numerical implementation show the expected improvement [O(Δt) vs. O(Δt^(1/2))] in the strong convergence rate both for the speed |v| and angular components of the scattering. An important result is that this improved convergence is achieved for the angular component of the scattering if and only if the "area-integral" terms in the Milstein scheme are included. The resulting Milstein scheme is of value as a step towards algorithms with both improved accuracy and efficiency. These include both algorithms with improved convergence in the averages (weak convergence) and multi-time-level schemes. The latter have been shown to give a greatly reduced cost for a given overall error level when compared with conventional Monte-Carlo schemes, and their performance is improved considerably when the Milstein algorithm is used for the underlying time advance versus the Euler-Maruyama algorithm. A new method for sampling the area integrals is given which is a simplification of an earlier direct method and which retains high accuracy. Lastly, this method, while being useful in its own right because of its relative simplicity, is also expected to considerably reduce the computational requirements for the direct conditional sampling of the area integrals that is needed for adaptive strong integration.
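
    A minimal illustration of the Euler-Maruyama versus Milstein distinction, on a generic scalar SDE rather than the collisional Langevin equations of the paper: the Milstein correction term 0.5*b(x)*b'(x)*(dW^2 - dt) is what raises the strong order from O(Δt^(1/2)) to O(Δt). The function names and the test problem below are illustrative assumptions.

```python
import numpy as np

def integrate_sde(x0, a, b, db, dt, n_steps, rng, milstein=True):
    """Integrate dX = a(X) dt + b(X) dW with Euler-Maruyama or Milstein.

    db is the derivative b'(x); the Milstein correction is
    0.5 * b(x) * b'(x) * (dW**2 - dt).
    """
    x = x0
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))
        x_new = x + a(x) * dt + b(x) * dw
        if milstein:
            x_new += 0.5 * b(x) * db(x) * (dw**2 - dt)
        x = x_new
    return x

# Hypothetical test problem: geometric Brownian motion dX = mu X dt + sigma X dW.
mu, sigma = 0.1, 0.4
rng = np.random.default_rng(0)
x_end = integrate_sde(1.0, lambda x: mu * x, lambda x: sigma * x,
                      lambda x: sigma, dt=1e-3, n_steps=1000, rng=rng)
print(x_end)
```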

  17. Usage of the back-propagation method for alphabet recognition

    NASA Astrophysics Data System (ADS)

    Shaila Sree, R. N.; Eswaran, Kumar; Sundararajan, N.

    1999-03-01

    Artificial Neural Networks play a pivotal role in the branch of Artificial Intelligence. They can be trained efficiently for a variety of tasks using different methods, of which the Back Propagation method is one. The paper studies the choice of the various design parameters of a neural network for the Back Propagation method. The study shows that when these parameters are properly assigned, the training task of the net is greatly simplified. The character recognition problem has been chosen as a test case for this study. A sample space of different handwritten characters of the English alphabet was gathered. A neural net is finally designed taking many of the design aspects into consideration and trained for different styles of writing. Experimental results are reported and discussed. It has been found that an appropriate choice of the design parameters of the neural net for the Back Propagation method reduces the training time and improves the performance of the net.
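
    A minimal sketch of back-propagation training on a toy problem, showing the kind of design parameters (hidden-layer size, learning rate, weight scale) that the study tunes. The network, data, and parameter values here are illustrative assumptions, not the paper's character-recognition setup.

```python
import numpy as np

# Minimal one-hidden-layer network trained with back-propagation on XOR.
rng = np.random.default_rng(42)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_hidden, lr, w_scale = 8, 1.0, 1.0          # design parameters (illustrative)
W1 = rng.normal(0, w_scale, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, w_scale, (n_hidden, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass (mean squared error loss).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates.
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # should approach [0, 1, 1, 0]
```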

  18. Rapid molecular diagnostics for multi-drug resistant tuberculosis in India.

    PubMed

    Ramachandran, Rajeswari; Muniyandi, M

    2018-03-01

    Rapid molecular diagnostic methods help in the detection of TB and Rifampicin resistance. These methods detect TB early, are accurate and play a crucial role in reducing the burden of drug resistant tuberculosis. Areas covered: This review analyses rapid molecular diagnostic tools used in the diagnosis of MDR-TB in India, such as the Line Probe Assay and GeneXpert. We have discussed the burden of MDR-TB and the impact of recent diagnostic tools on case detection and treatment outcomes. This review also discusses the costs involved in establishing these new techniques in India. Expert commentary: Molecular methods have considerable advantages for the programmatic management of drug resistant TB. These include speed, standardization of testing, potentially high throughput and reduced laboratory biosafety requirements. There is a desperate need for India to adopt modern, rapid, molecular tools with point-of-care tests being currently evaluated. New molecular diagnostic tests appear to be cost effective and also help in detecting missing cases. There is enough evidence to support the scaling up of these new tools in India.

  19. Motion Planning of Two Stacker Cranes in a Large-Scale Automated Storage/Retrieval System

    NASA Astrophysics Data System (ADS)

    Kung, Yiheng; Kobayashi, Yoshimasa; Higashi, Toshimitsu; Ota, Jun

    We propose a method for reducing the computational time of motion planning for stacker cranes. Most automated storage/retrieval systems (AS/RSs) are equipped with only one stacker crane. However, this is logistically challenging, and greater work efficiency in warehouses, such as those using two stacker cranes, is required. In this paper, a warehouse with two stacker cranes working simultaneously is proposed. Unlike warehouses with only one crane, trajectory planning in those with two cranes is very difficult. Since there are two cranes working together, a proper trajectory must be planned to avoid collision. However, verifying collisions is complicated and requires a considerable amount of computational time. As transport work in AS/RSs occurs randomly, motion planning cannot be conducted in advance, so planning an appropriate trajectory within a restricted duration is a difficult task. We thereby address the current problem of motion planning requiring extensive calculation time. As a solution, we propose a "free-step" to simplify the procedure of collision verification and reduce the computational time. In addition, we propose a method to reschedule the order of collision verification in order to find an appropriate trajectory in less time. With the proposed method, we reduce the calculation time to less than 1/300 of that achieved in former research.

  20. Strategic assay deployment as a method for countering analytical bottlenecks in high throughput process development: case studies in ion exchange chromatography.

    PubMed

    Konstantinidis, Spyridon; Heldin, Eva; Chhatre, Sunil; Velayudhan, Ajoy; Titchener-Hooker, Nigel

    2012-01-01

    High throughput approaches to facilitate the development of chromatographic separations have now been adopted widely in the biopharmaceutical industry, but issues of how to reduce the associated analytical burden remain. For example, acquiring experimental data by high level factorial designs in 96 well plates can place a considerable strain upon assay capabilities, generating a bottleneck that limits significantly the speed of process characterization. This article proposes an approach designed to counter this challenge: Strategic Assay Deployment (SAD). In SAD, a set of available analytical methods is investigated to determine which set of techniques is the most appropriate to use and how best to deploy these to reduce the consumption of analytical resources while still enabling accurate and complete process characterization. The approach is demonstrated by investigating how salt concentration and pH affect the binding of green fluorescent protein from Escherichia coli homogenate to an anion exchange resin presented in a 96-well filter plate format. Compared with the deployment of routinely used analytical methods alone, the application of SAD reduced both the total assay time and total assay material consumption by at least 40% and 5%, respectively. SAD has significant utility in accelerating bioprocess development activities. Copyright © 2012 American Institute of Chemical Engineers (AIChE).

  1. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to support watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (Soil and Water Assessment Tool or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.

  2. Diffusion Weighted Image Denoising Using Overcomplete Local PCA

    PubMed Central

    Manjón, José V.; Coupé, Pierrick; Concha, Luis; Buades, Antonio; Collins, D. Louis; Robles, Montserrat

    2013-01-01

    Diffusion Weighted Images (DWI) normally show a low Signal to Noise Ratio (SNR) due to the presence of noise from the measurement process that complicates and biases the estimation of quantitative diffusion parameters. In this paper, a new denoising methodology is proposed that takes into consideration the multicomponent nature of multi-directional DWI datasets such as those employed in diffusion imaging. This new filter reduces random noise in multicomponent DWI by locally shrinking less significant Principal Components using an overcomplete approach. The proposed method is compared with state-of-the-art methods using synthetic and real clinical MR images, showing improved performance in terms of denoising quality and estimation of diffusion parameters. PMID:24019889
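
    The local PCA shrinkage idea can be sketched as follows for a generic multi-component 2D image: each sliding patch is decomposed by PCA across components, minor components are discarded, and the overlapping estimates are averaged (the overcomplete part). This is a simplified illustration under assumed parameters, not the published filter or its noise-adaptive threshold.

```python
import numpy as np

def local_pca_denoise(img, patch=5, keep_var=0.95):
    """Denoise a multi-component image of shape (H, W, K) by local PCA shrinkage.

    For every (patch x patch) neighbourhood, the K component values of its pixels
    are decomposed by PCA and only the leading components explaining `keep_var`
    of the variance are kept; overlapping estimates are averaged.
    """
    H, W, K = img.shape
    out = np.zeros_like(img, dtype=float)
    weight = np.zeros((H, W, 1))
    for i in range(H - patch + 1):
        for j in range(W - patch + 1):
            block = img[i:i + patch, j:j + patch, :].reshape(-1, K)
            mean = block.mean(axis=0)
            U, s, Vt = np.linalg.svd(block - mean, full_matrices=False)
            var = s**2 / (np.sum(s**2) + 1e-12)
            r = np.searchsorted(np.cumsum(var), keep_var) + 1
            approx = (U[:, :r] * s[:r]) @ Vt[:r] + mean
            out[i:i + patch, j:j + patch, :] += approx.reshape(patch, patch, K)
            weight[i:i + patch, j:j + patch, :] += 1
    return out / weight

# Hypothetical demo: a smooth 3-component image corrupted by Gaussian noise.
rng = np.random.default_rng(0)
xx, yy = np.meshgrid(np.linspace(0, 1, 40), np.linspace(0, 1, 40))
clean = np.stack([xx, yy, xx * yy], axis=-1)
noisy = clean + 0.05 * rng.normal(size=clean.shape)
denoised = local_pca_denoise(noisy)
print(np.abs(noisy - clean).mean(), np.abs(denoised - clean).mean())
```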

  3. On the Use of Accelerated Test Methods for Characterization of Advanced Composite Materials

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.

    2003-01-01

    A rational approach to the problem of accelerated testing for material characterization of advanced polymer matrix composites is discussed. The experimental and analytical methods provided should be viewed as a set of tools useful in the screening of material systems for long-term engineering properties in aerospace applications. Consideration is given to long-term exposure in extreme environments that include elevated temperature, reduced temperature, moisture, oxygen, and mechanical load. Analytical formulations useful for predictive models that are based on the principles of time-based superposition are presented. The need for reproducible mechanisms, indicator properties, and real-time data are outlined as well as the methodologies for determining specific aging mechanisms.

  4. Immersion Cooling of Electronics in DoD Installations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coles, Henry; Herrlin, Magnus

    A considerable amount of energy is consumed to cool electronic equipment in data centers. A method for substantially reducing the energy needed for this cooling was demonstrated. The method involves immersing electronic equipment in a non-conductive liquid that changes phase from a liquid to a gas. The liquid used was 3M Novec 649. Two-phase immersion cooling using this liquid is not viable at this time. The primary obstacles are IT equipment failures and costs. However, the demonstrated technology met the performance objectives for energy efficiency and greenhouse gas reduction. Before commercialization of this technology can occur, a root cause analysis of the failures should be completed, and the design changes proven.

  5. GPU Accelerated Prognostics

    NASA Technical Reports Server (NTRS)

    Gorospe, George E., Jr.; Daigle, Matthew J.; Sankararaman, Shankar; Kulkarni, Chetan S.; Ng, Eley

    2017-01-01

    Prognostic methods enable operators and maintainers to predict the future performance for critical systems. However, these methods can be computationally expensive and may need to be performed each time new information about the system becomes available. In light of these computational requirements, we have investigated the application of graphics processing units (GPUs) as a computational platform for real-time prognostics. Recent advances in GPU technology have reduced cost and increased the computational capability of these highly parallel processing units, making them more attractive for the deployment of prognostic software. We present a survey of model-based prognostic algorithms with considerations for leveraging the parallel architecture of the GPU and a case study of GPU-accelerated battery prognostics with computational performance results.

  6. A structure adapted multipole method for electrostatic interactions in protein dynamics

    NASA Astrophysics Data System (ADS)

    Niedermeier, Christoph; Tavan, Paul

    1994-07-01

    We present an algorithm for rapid approximate evaluation of electrostatic interactions in molecular dynamics simulations of proteins. Traditional algorithms require computational work of the order O(N^2) for a system of N particles. Truncation methods which try to avoid that effort entail intolerably large errors in forces, energies and other observables. Hierarchical multipole expansion algorithms, which can account for the electrostatics to numerical accuracy, scale with O(N log N) or even with O(N) if they become augmented by a sophisticated scheme for summing up forces. To further reduce the computational effort we propose an algorithm that also uses a hierarchical multipole scheme but considers only the first two multipole moments (i.e., charges and dipoles). Our strategy is based on the consideration that numerical accuracy may not be necessary to reproduce protein dynamics with sufficient correctness. As opposed to previous methods, our scheme for hierarchical decomposition is adjusted to structural and dynamical features of the particular protein considered rather than chosen rigidly as a cubic grid. As compared to truncation methods we manage to reduce errors in the computation of electrostatic forces by a factor of 10 with only marginal additional effort.
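
    A minimal sketch of the truncation to the first two multipole moments: the potential of a distant particle cluster is approximated from its total charge and dipole moment only (units chosen so the Coulomb constant is 1). The cluster data and evaluation point are hypothetical, and the hierarchical decomposition itself is not reproduced here.

```python
import numpy as np

def cluster_moments(pos, q, center):
    """Monopole (total charge) and dipole moment of a particle cluster."""
    Q = q.sum()
    P = ((pos - center) * q[:, None]).sum(axis=0)
    return Q, P

def far_potential(x, center, Q, P):
    """Potential at x from the first two multipole moments only."""
    d = x - center
    r = np.linalg.norm(d)
    return Q / r + P @ d / r**3

# Hypothetical cluster of partial charges and a distant evaluation point.
rng = np.random.default_rng(1)
pos = rng.normal(size=(20, 3))
q = rng.normal(size=20)
center = pos.mean(axis=0)
x = np.array([15.0, 0.0, 0.0])

Q, P = cluster_moments(pos, q, center)
exact = sum(qi / np.linalg.norm(x - ri) for qi, ri in zip(q, pos))
print(far_potential(x, center, Q, P), exact)  # the two values should be close
```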

  7. High pressure-assisted transfer of ultraclean chemical vapor deposited graphene

    NASA Astrophysics Data System (ADS)

    Chen, Zhiying; Ge, Xiaoming; Zhang, Haoran; Zhang, Yanhui; Sui, Yanping; Yu, Guanghui; Jin, Zhi; Liu, Xinyu

    2016-03-01

    We develop a high pressure-assisted (approximately 1000 kPa) transfer method to remove polymer residues and effectively reduce damage on the surface of graphene. By introducing an ethanol pre-dehydration technique and optimizing temperature, the graphene surface becomes nearly free of residues, and the quality of graphene is improved noticeably when the temperature reaches 140 °C. The graphene obtained using the high pressure-assisted transfer method also exhibits excellent electrical properties, with an average sheet resistance of approximately 290 Ω/sq and a mobility of 1210 cm²/V·s at room temperature. Sheet resistance and mobility are considerably improved compared with those of the graphene obtained using the normal wet transfer method (average sheet resistance of approximately 510 Ω/sq and mobility of 750 cm²/V·s).

  8. Method of Optimizing the Construction of Machining, Assembly and Control Devices

    NASA Astrophysics Data System (ADS)

    Iordache, D. M.; Costea, A.; Niţu, E. L.; Rizea, A. D.; Babă, A.

    2017-10-01

    Industry dynamics, driven by economic and social requirements, must generate more interest in technological optimization, which is capable of ensuring the steady development of advanced technical means to equip machining processes. For these reasons, the development of tools, devices, work equipment and control, as well as the modernization of machine tools, is the sure path to modernizing production systems, although it requires considerable time and effort. This type of approach is also reflected in our theoretical, experimental and industrial applications of recent years, presented in this paper, whose main objectives are the elaboration and use of mathematical models, new calculation methods, optimization algorithms, new processing and control methods, as well as structures for the construction and configuration of technological equipment with a high level of performance and substantially reduced costs.

  9. Noise Estimation in Electroencephalogram Signal by Using Volterra Series Coefficients

    PubMed Central

    Hassani, Malihe; Karami, Mohammad Reza

    2015-01-01

    The Volterra model is widely used for nonlinearity identification in practical applications. In this paper, we employed the Volterra model to find the nonlinear relation between the electroencephalogram (EEG) signal and its noise, which is a novel approach to estimating noise in EEG signals. We show that, by employing this method, we can considerably improve the signal-to-noise ratio, by a factor of at least 1.54. An important issue in implementing the Volterra model is its computational complexity, especially when the degree of nonlinearity is increased. Hence, in many applications it is important to reduce the complexity of computation. In this paper, we use the properties of the EEG signal and propose a good approximation of the delayed input signal by its adjacent samples in order to reduce the computation involved in finding the Volterra series coefficients. The computational complexity is reduced by a factor of at least 1/3 when the filter memory is 3. PMID:26284176

  10. The Role of Multimodal Analgesia in Spine Surgery.

    PubMed

    Kurd, Mark F; Kreitz, Tyler; Schroeder, Gregory; Vaccaro, Alexander R

    2017-04-01

    Optimal postoperative pain control allows for faster recovery, reduced complications, and improved patient satisfaction. Historically, pain management after spine surgery relied heavily on opioid medications. Multimodal regimens were developed to reduce opioid consumption and associated adverse effects. Multimodal approaches used in orthopaedic surgery of the lower extremity, especially joint arthroplasty, have been well described and studies have shown reduced opioid consumption, improved pain and function, and decreased length of stay. A growing body of evidence supports multimodal analgesia in spine surgery. Methods include the use of preemptive analgesia, NSAIDs, the neuromodulatory agents gabapentin and pregabalin, acetaminophen, and extended-action local anesthesia. The development of a standard approach to multimodal analgesia in spine surgery requires extensive assessment of the literature. Because a substantial number of spine surgeries are performed annually, a standardized approach to multimodal analgesia may provide considerable benefits, particularly in the context of the increased emphasis on accountability within the healthcare system.

  11. Span-Load Distribution as a Factor in Stability in Roll

    NASA Technical Reports Server (NTRS)

    Knight, Montgomery; Noyes, Richard W

    1932-01-01

    This report gives the results of pressure-distribution tests made to study the effects on lateral stability of changing the span-load distribution on a rectangular monoplane wing model of fairly thick section. Three methods of changing the distribution were employed: variation in profile along the span to a thin symmetrical section at the tip, twist from +5 degrees to -15 degrees at the tip, and sweepback from +20 degrees to -20 degrees. The tests were conducted in a 5-foot closed-throat atmospheric wind tunnel. The investigation shows the following results: (1) change in profile along the span from the NACA-84 at the root to the NACA-M2 at the tip considerably reduces lateral instability, but also reduces the general effectiveness of the wing. (2) washout up to 11 degrees progressively reduces maximum lateral instability. (3) transition from sweepforward to sweepback gradually reduces the useful angle-of-attack range, but has no clearly defined effect on maximum lateral instability.

  12. Can Food be Addictive? Public Health and Policy Implications

    PubMed Central

    Gearhardt, Ashley N.; Grilo, Carlos M.; DiLeone, Ralph J.; Brownell, Kelly D.; Potenza, Marc N.

    2011-01-01

    Aims Data suggest that hyperpalatable foods may be capable of triggering an addictive process. Although the addictive potential of foods continues to be debated, important lessons learned in reducing the health and economic consequences of drug addiction may be especially useful in combating food-related problems. Methods In the current paper, we review the potential application of policy and public health approaches that have been effective in reducing the impact of addictive substances to food-related problems. Results Corporate responsibility, public health approaches, environmental change, and global efforts all warrant strong consideration in reducing obesity and diet-related disease. Conclusions Although there exist important differences between foods and addictive drugs, ignoring analogous neural and behavioral effects of foods and drugs of abuse may result in increased food-related disease and associated social and economic burdens. Public health interventions that have been effective in reducing the impact of addictive drugs may have a role in targeting obesity and related diseases. PMID:21635588

  13. Exploring the nature of stigmatising beliefs about depression and help-seeking: Implications for reducing stigma

    PubMed Central

    Barney, Lisa J; Griffiths, Kathleen M; Christensen, Helen; Jorm, Anthony F

    2009-01-01

    Background In-depth and structured evaluation of the stigma associated with depression has been lacking. This study aimed to inform the design of interventions to reduce stigma by systematically investigating community perceptions of beliefs about depression according to theorised dimensional components of stigma. Methods Focus group discussions were held with a total of 23 adults with personal experience of depression. The discussions were taped, transcribed and thematically analysed. Results Participants typically reported experiencing considerable stigma, particularly that others believe depressed people are responsible for their own condition, are undesirable to be around, and may be a threat. Participants expressed particular concerns about help-seeking in the workplace and from mental health professionals. Conclusion Findings indicate that interventions to reduce the stigma of depression should target attributions of blame; reduce avoidance of depressed people; label depression as a 'health condition' rather than 'mental illness'; and improve responses of help-sources (i.e. via informing professionals of client fears). PMID:19228435

  14. Bioprocess development for hexavalent chromium reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turick, C.E.; Apel, W.A.

    1996-10-01

    Hexavalent chromium (Cr(VI)) exists in the environment from anthropogenic activity and is regarded as a highly mobile and toxic pollutant. There is considerable interest in developing effective and efficient methods for the remediation of contaminated media. Many bacterial isolates have been demonstrated to metabolically reduce Cr(VI) to Cr(III), a much less toxic, more easily recoverable form of chromium. Aerobic and anaerobic cultures of Cr(VI) reducing bacteria were analyzed for their ability to reduce Cr(VI) prior to bioreactor design and scale up. Batch studies demonstrated Cr(VI) reduction rates with aerobic bacteria of up to 3 mg/hr/g dry cells, while anaerobic cultures exhibited Cr(VI) reduction rates up to 22 mg/hr/g dry cells. An aerobic mixed culture of Cr(VI) reducing bacteria was chosen for bioreactor studies due to better rates of Cr(VI) reduction as well as the robust nature of the culture. These properties will allow for ease in bioprocess operation in the field.

  15. Effects of hot-air and hybrid hot air-microwave drying on drying kinetics and textural quality of nectarine slices

    NASA Astrophysics Data System (ADS)

    Miraei Ashtiani, Seyed-Hassan; Sturm, Barbara; Nasirahmadi, Abozar

    2018-04-01

    Drying and physicochemical characteristics of nectarine slices were investigated using hot-air and hybrid hot air-microwave drying methods under fixed air temperature and air speed (50 °C and 0.5 m/s, respectively). Microwave power levels for the combined hot air-microwave method were 80, 160, 240, and 320 W. Drying kinetics were analyzed and compared using six mathematical models. For both drying methods the model that best described the drying behavior was the Midilli-Kucuk model. For this model, the coefficient of determination (R²) was greater than 0.999, with root mean square error (RMSE) below 0.006 and reduced chi-square (χ²) below 0.0001 for hybrid hot air-microwave drying; for hot-air drying, R² was above 0.999, with RMSE below 0.003 and χ² below 0.0001. Results showed that the hybrid method reduced the drying time considerably and produced products with higher quality. The effective moisture diffusivity (D_eff) ranged between 8.15 × 10⁻⁸ and 2.83 × 10⁻⁷ m²/s for hybrid drying, compared with 1.27 × 10⁻⁸ m²/s for hot-air drying. The total color difference (ΔE) ranged from 36.68 to 44.27 for the hybrid method, whereas for hot-air drying it was 49.64. Although reduced microwave power output led to a lower drying rate, it reduced changes in product parameters, i.e. total color change, surface roughness, shrinkage and microstructural change, and increased hardness and water uptake.
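
    A sketch of how the Midilli-Kucuk model, MR = a·exp(-k·t^n) + b·t, and the quoted goodness-of-fit statistics (R², RMSE, reduced χ²) can be obtained with a standard nonlinear least-squares fit. The moisture-ratio data below are hypothetical, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def midilli_kucuk(t, a, k, n, b):
    """Midilli-Kucuk thin-layer drying model: MR = a*exp(-k*t**n) + b*t."""
    return a * np.exp(-k * t**n) + b * t

# Hypothetical moisture-ratio data (time in minutes, MR dimensionless).
t = np.array([0, 30, 60, 90, 120, 150, 180, 240], dtype=float)
mr = np.array([1.00, 0.78, 0.58, 0.44, 0.33, 0.25, 0.19, 0.11])

popt, _ = curve_fit(midilli_kucuk, t, mr, p0=[1.0, 0.01, 1.0, 0.0], maxfev=10000)
pred = midilli_kucuk(t, *popt)

ss_res = np.sum((mr - pred) ** 2)
r2 = 1 - ss_res / np.sum((mr - mr.mean()) ** 2)          # coefficient of determination
rmse = np.sqrt(ss_res / len(mr))                          # root mean square error
chi2_red = ss_res / (len(mr) - len(popt))                 # reduced chi-square
print(popt, r2, rmse, chi2_red)
```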

  16. Spacesuit Radiation Shield Design Methods

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Anderson, Brooke M.; Cucinotta, Francis A.; Ware, J.; Zeitlin, Cary J.

    2006-01-01

    Meeting radiation protection requirements during EVA is predominantly an operational issue with some potential considerations for temporary shelter. The issue of spacesuit shielding is mainly guided by the potential of accidental exposure when operational and temporary shelter considerations fail to maintain exposures within operational limits. In this case, very high exposure levels are possible which could result in observable health effects and even be life threatening. Under these assumptions, potential spacesuit radiation exposures have been studied using known historical solar particle events to gain insight on the usefulness of modification of spacesuit design in which the control of skin exposure is a critical design issue and reduction of blood forming organ exposure is desirable. Transition to a new spacesuit design including soft upper-torso and reconfigured life support hardware gives an opportunity to optimize the next generation spacesuit for reduced potential health effects during an accidental exposure.

  17. An Investigation to Improve Quality Evaluations of Primers and Propellant for 20mm Munitions

    NASA Technical Reports Server (NTRS)

    Bement, L. J.; Holmes, C.; McGrory, J.; Schimmel, M. L.

    1997-01-01

    To reduce the frequency of electrically initiated, 20mm munition hangfires (delayed ignitions), a joint Army/NASA investigation was conducted to recommend quality evaluation improvements for acceptance of both primers and gun propellant. This effort focused only on evaluating ignition and combustion performance as potential causes of hangfires: poor electrical initiation of the primer, low output performance of the primer, low ignition sensitivity of the gun propellant, and the effects of cold temperature. The goal was to determine the "best" of the Army and NASA test methods to assess the functional performance of primers and gun propellants. The approach was to evaluate the performance of both high-quality and deliberately defective primers to challenge the sensitivity of test methods. In addition, the ignition sensitivity of different manufacturing batches of gun propellants was evaluated. The results of the investigation revealed that improvements can be made in functional evaluations that can assist in identifying and reducing ignition and performance variations. The "best" functional evaluation of primers and propellant is achieved through a combination of both Army and NASA test methods. Incorporating the recommendations offered in this report may provide for considerable savings in reducing the number of cartridge firings, while significantly lowering the rejection rate of primer, propellant and cartridge lots. The most probable causes for ignition and combustion-related hangfires were the lack of calcium silicide in the primer mix, a low output performance of primers, and finally, poor ignition sensitivity of gun propellant. Cold temperatures further reduce propellant ignition sensitivity, as well as reducing burn rate and chamber pressures.

  18. Effect of natural windbreaks on drift reduction in orchard spraying.

    PubMed

    Wenneker, M; Heijne, B; van de Zande, J C

    2005-01-01

    In the Netherlands windbreaks are commonly grown to protect orchards against wind damage and to improve micro-climate. Natural windbreaks of broad-leaved trees can also reduce the risk of surface water contamination caused by spray drift during orchard spraying. Spray drift from pesticide applications is a major concern in the Netherlands, especially drift into water courses. So far, several drift reducing measures have been accepted by water quality control organisations and the Board for the Authorization of Pesticides (CTB), e.g. presence of a windbreak (i.e. 70% drift reduction at early season and 90% drift reduction at full leaf, respectively before and after first of May). From the experiments it was concluded that the risk of drift contamination is high during the early developmental stages of the growing season. The 70% drift reduction at early season as determined in previous experiments, appears to be valid only for windbreaks with a certain degree of developed leaves. At full leaf stage 80-90% drift reduction by the windbreak was measured. The use of evergreen windbreaks or wind-break species that develop in early season can reduce the risk of drift contamination considerably. Also, the combination of drift reducing methods, such as one-sided spraying of the last tree row and a windbreak is an effective method to reduce spray drift in the Netherlands in early season.

  19. Applying Magneto-rheology to Reduce Blood Viscosity and Suppress Turbulence to Prevent Heart Attacks

    NASA Astrophysics Data System (ADS)

    Tao, R.

    Heart attacks are the leading cause of death in the USA. Research indicates one common thread, high blood viscosity, linking all cardiovascular diseases. Turbulence in blood circulation makes different regions of the vasculature vulnerable to development of atherosclerotic plaque. Turbulence is also responsible for systolic ejection murmurs and places a heavier workload on the heart, a possible trigger of heart attacks. Presently, neither medicine nor method is available to suppress turbulence. The only method to reduce blood viscosity is to take medicine, such as aspirin. However, using medicine to reduce the blood viscosity does not help suppress turbulence. In fact, the turbulence gets worse as the Reynolds number goes up with the viscosity reduction by the medicine. Here we report our new discovery: when a strong magnetic field is applied to blood along its flow direction, red blood cells are polarized in the magnetic field and aggregated into short chains along the flow direction. The blood viscosity becomes anisotropic: along the flow direction the viscosity is significantly reduced, but in the directions perpendicular to the flow the viscosity is considerably increased. In this way, the blood flow becomes laminar, turbulence is suppressed, the blood circulation is greatly improved, and the risk of heart attacks is reduced. While these effects are not permanent, they last for about 24 hours after one magnetic therapy treatment.

  20. Generation of Non-Homogeneous Poisson Processes by Thinning: Programming Considerations and Comparison with Competing Algorithms.

    DTIC Science & Technology

    1978-12-01

    Poisson processes. The method is valid for Poisson processes with any given intensity function. The basic thinning algorithm is modified to exploit several refinements which reduce computer execution time by approximately one-third. The basic and modified thinning programs are compared with the Poisson decomposition and gap-statistics algorithm, which is easily implemented for Poisson processes with intensity functions of the form exp(a_0 + a_1*t + a_2*t^2). The thinning programs are competitive in both execution
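
    A minimal sketch of the thinning (Lewis-Shedler) algorithm for an intensity of the quoted form exp(a_0 + a_1*t + a_2*t^2): candidate events are drawn from a homogeneous process whose rate bounds the intensity, and each candidate is kept with probability intensity(t)/rate. The coefficients, time horizon, and the way the bounding rate is obtained are illustrative assumptions.

```python
import numpy as np

def thin_nhpp(intensity, lam_max, T, rng):
    """Simulate a non-homogeneous Poisson process on [0, T] by thinning.

    lam_max must bound intensity(t) on [0, T]; each homogeneous candidate at
    time t is accepted with probability intensity(t) / lam_max.
    """
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)          # next candidate arrival
        if t > T:
            return np.array(times)
        if rng.uniform() < intensity(t) / lam_max:   # accept (thin) step
            times.append(t)

# Intensity of the form exp(a0 + a1*t + a2*t**2), as in the report.
a0, a1, a2 = 0.5, 0.4, -0.02
intensity = lambda t: np.exp(a0 + a1 * t + a2 * t**2)
T = 20.0
lam_max = intensity(np.linspace(0.0, T, 2001)).max()  # numerical bound on the smooth intensity

events = thin_nhpp(intensity, lam_max, T, np.random.default_rng(0))
print(len(events), events[:5])
```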

  1. Proceedings: Demilitarization and Disposal Technology Conference (2nd) Held at Salt Lake City, Utah on April 24, 25, 26, 1979,

    DTIC Science & Technology

    1979-04-01

    AAP contains a wet scrubber system. The scrubber is a combination spray chamber/ venturi / marble bed unit capable of attaining a 21" WG pressure drop...requirements until the feed rates are reduced considerably. Water quality data from the scrubber show that the heavy metals and low pH to be the major water...demilitarized using this method. The process water, scrubber water, and all clean-up water are treated by a water treatment system. This treatment

  2. Thiol-modified gold nanoparticles for the inhibition of Mycobacterium smegmatis.

    PubMed

    Gifford, Jennifer C; Bresee, Jamee; Carter, Carly Jo; Wang, Guankui; Melander, Roberta J; Melander, Christian; Feldheim, Daniel L

    2014-12-28

    Antimicrobial drug discovery has slowed considerably over the last few decades. One major cause for concern is the lack of innovative approaches to treat infections caused by mycobacteria such as TB. Herein we demonstrate that our Small Molecule Variable Ligand Display (SMLVD) method for nanoparticle antibiotic discovery can be expanded around a ligand feed ratio parameter space to identify gold nanoparticle conjugates that are potent inhibitors of mycobacteria growth, with our most potent inhibitor able to reduce growth by five orders of magnitude at 8 μM.

  3. Thiol-modified gold nanoparticles for the inhibition of Mycobacterium smegmatis†

    PubMed Central

    Gifford, Jennifer C.; Bresee, Jamee; Carter, Carly Jo; Wang, Guankui; Melander, Roberta J.; Melander, Christian; Feldheim, Daniel L.

    2015-01-01

    Antimicrobial drug discovery has slowed considerably over the last few decades. One major cause for concern is the lack of innovative approaches to treat infections caused by mycobacteria such as TB. Herein we demonstrate that our Small Molecule Variable Ligand Display (SMLVD) method for nanoparticle antibiotic discovery can be expanded around a ligand feed ratio parameter space to identify gold nanoparticle conjugates that are potent inhibitors of mycobacteria growth, with our most potent inhibitor able to reduce growth by five orders of magnitude at 8 μM. PMID:25350535

  4. Electrochemical oxidation for landfill leachate treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, Yang; Englehardt, James D.

    2007-07-01

    This paper aims at providing an overview of electrochemical oxidation processes used for treatment of landfill leachate. The typical characteristics of landfill leachate are briefly reviewed, and the reactor designs used for electro-oxidation of leachate are summarized. Electrochemical oxidation can significantly reduce concentrations of organic contaminants, ammonia, and color in leachate. Pretreatment methods, anode materials, pH, current density, chloride concentration, and other additional electrolytes can considerably influence performance. Although high energy consumption and potential chlorinated organics formation may limit its application, electrochemical oxidation is a promising and powerful technology for treatment of landfill leachate.

  5. A Lightweight Loudspeaker for Aircraft Communications and Active Noise Control

    NASA Technical Reports Server (NTRS)

    Warnaka, Glenn E.; Kleinle, Mark; Tsangaris, Parry; Oslac, Michael J.; Moskow, Harry J.

    1992-01-01

    A series of new, lightweight loudspeakers for use on commercial aircraft has been developed. The loudspeakers use NdFeB magnets and aluminum alloy frames to reduce the weight. The NdFeB magnet is virtually encapsulated by steel in the new speaker designs. Active noise reduction using internal loudspeakers was demonstrated to be effective in 1983. A weight, space, and cost efficient method for creating the active sound attenuating fields is to use the existing cabin loudspeakers for both communication and sound attenuation. This will require some additional loudspeaker design considerations.

  6. Characteristics of Perforated Diffusers at Free-stream Mach Number 1.90

    NASA Technical Reports Server (NTRS)

    Hunczak, Henry R; Kremzier, Emil J

    1950-01-01

    An investigation was conducted at Mach number 1.90 to determine pressure recovery and mass-flow characteristics of series of perforated convergent-divergent supersonic diffusers. Pressure recoveries as high as 96 percent were obtained, but at reduced mass flows through the diffuser. Theoretical considerations of effect of perforation distribution on shock stability in converging section of diffuser are presented and correlated with experimental data. A method of estimating relative importance of pressure recovery and mass flow on internal thrust coefficient basis is given and a comparison of various diffusers investigated is made.

  7. A low-rank matrix recovery approach for energy efficient EEG acquisition for a wireless body area network.

    PubMed

    Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab

    2014-08-25

    We address the problem of acquiring and transmitting EEG signals in Wireless Body Area Networks (WBAN) in an energy efficient fashion. In WBANs, the energy is consumed by three operations: sensing (sampling), processing and transmission. Previous studies only addressed the problem of reducing the transmission energy. For the first time, in this work, we propose a technique to reduce sensing and processing energy as well: this is achieved by randomly under-sampling the EEG signal. We depart from previous Compressed Sensing based approaches and formulate signal recovery (from under-sampled measurements) as a matrix completion problem. A new algorithm to solve the matrix completion problem is derived here. We test our proposed method and find that the reconstruction accuracy of our method is significantly better than state-of-the-art techniques; and we achieve this while saving sensing, processing and transmission energy. Simple power analysis shows that our proposed methodology consumes considerably less power compared to previous CS based techniques.
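
    A generic sketch of recovering a low-rank matrix from a subset of entries by singular value soft-thresholding (proximal gradient on a nuclear-norm-regularized objective). This is a standard baseline, not the new recovery algorithm derived in the paper; the rank, sampling ratio, and threshold below are illustrative assumptions.

```python
import numpy as np

def matrix_complete(M_obs, mask, tau=1.0, n_iter=300):
    """Approximately solve min_X 0.5*||P_Omega(X - M)||_F^2 + tau*||X||_*.

    M_obs holds the observed entries (zeros elsewhere); mask is the boolean
    sampling pattern. Each iteration takes a gradient step on the data term
    and soft-thresholds the singular values.
    """
    X = M_obs.astype(float).copy()
    for _ in range(n_iter):
        G = np.where(mask, X - M_obs, 0.0)         # gradient of the data term
        U, s, Vt = np.linalg.svd(X - G, full_matrices=False)
        s = np.maximum(s - tau, 0.0)               # soft-threshold singular values
        X = (U * s) @ Vt
    return X

# Hypothetical demo: a rank-2 matrix observed at 40% of its entries.
rng = np.random.default_rng(0)
truth = rng.normal(size=(60, 2)) @ rng.normal(size=(2, 40))
mask = rng.uniform(size=truth.shape) < 0.4
recovered = matrix_complete(np.where(mask, truth, 0.0), mask, tau=0.5)
print(np.linalg.norm(recovered - truth) / np.linalg.norm(truth))  # relative error
```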

  8. Planetary Protection Considerations For Exomars Meteorological Instrumentation.

    NASA Astrophysics Data System (ADS)

    Camilletti, Adam

    2007-10-01

    Planetary protection requirements for Oxford University's contribution to the upcoming ESA ExoMars mission are discussed, and the current methods being used to fulfil these requirements are detailed and reviewed. Oxford University is supplying temperature and wind sensors to the mission, and since these will be exposed to the Martian environment there is a requirement that they be sterilised to the stringent COSPAR standards adhered to by ESA. Typically dry heat microbial reduction (DHMR) is used to reduce spacecraft bioburden, but the high temperatures involved are not compatible with some hardware elements. Alternative, low-temperature sterilisation methods are reviewed and their applicability to spacecraft hardware discussed. The use of a commercially available, bench-top endotoxin tester in planetary protection is also discussed, and data from preliminary tests performed at Oxford are presented. These devices, which utilise the immune response of horseshoe crabs to the presence of endotoxin, have the potential to reduce the time taken to determine bioburden by removing the need for conventional assaying - a lengthy and sometimes expensive process.

  9. Reduced-rank technique for joint channel estimation in TD-SCDMA systems

    NASA Astrophysics Data System (ADS)

    Kamil Marzook, Ali; Ismail, Alyani; Mohd Ali, Borhanuddin; Sali, Adawati; Khatun, Sabira

    2013-02-01

    In time division-synchronous code division multiple access (TD-SCDMA) systems, increasing the system capacity by placing the largest possible number of users in one time slot (TS) requires additional estimation processes to estimate the joint channel matrix for the whole system. The increase in the number of channel parameters due to the increase in the number of users in one TS directly affects the precision of the estimator's performance. This article presents a novel low-complexity channel estimation method, which relies on reducing the rank order of the total channel matrix H. The proposed method exploits the rank deficiency of H to reduce the number of parameters that characterise this matrix. The adopted reduced-rank technique is based on the truncated singular value decomposition algorithm. The algorithms for reduced-rank joint channel estimation (JCE) are derived and compared against traditional full-rank JCEs: least squares (LS, or Steiner) and enhanced (LS or MMSE) algorithms. Simulation results of the normalised mean square error showed the superiority of reduced-rank estimators. In addition, the channel impulse responses found by the reduced-rank estimator for all active users offer considerable performance improvement over the conventional estimator along the channel window length.
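
    The reduced-rank step can be sketched as a truncated SVD applied to a noisy full-rank channel estimate; keeping only the dominant singular components lowers the normalised mean square error when the true channel matrix is rank-deficient. The dimensions, rank, and noise level below are illustrative assumptions, not parameters of the TD-SCDMA system in the paper.

```python
import numpy as np

def reduced_rank(H, r):
    """Rank-r approximation of an estimated channel matrix via truncated SVD."""
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

# Hypothetical noisy least-squares channel estimate whose true rank is 3.
rng = np.random.default_rng(2)
H_true = rng.normal(size=(32, 3)) @ rng.normal(size=(3, 16))
H_ls = H_true + 0.3 * rng.normal(size=H_true.shape)   # full-rank LS-style estimate
H_rr = reduced_rank(H_ls, r=3)

nmse = lambda A: np.linalg.norm(A - H_true)**2 / np.linalg.norm(H_true)**2
print(nmse(H_ls), nmse(H_rr))   # the reduced-rank estimate should have lower NMSE
```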

  10. Many-Agent Controlled Teleportation of Multi-qubit Quantum Information via Quantum Entanglement Swapping

    NASA Astrophysics Data System (ADS)

    Zhang, Zhan-Jun; Liu, Yi-Min; Man, Zhong-Xiao

    2005-11-01

    We present a method to teleport multi-qubit quantum information in a simple way from a sender to a receiver via the control of many agents in a network. Only when all the agents collaborate with the quantum information receiver can the unknown states in the sender's qubits be fully reconstructed in the receiver's qubits. In our method, the agents' control parameters are obtained via quantum entanglement swapping. As far as the realization of the many-agent controlled teleportation is concerned, compared to a recent method [C.P. Yang, et al., Phys. Rev. A 70 (2004) 022329], our present method considerably reduces the preparation difficulty of initial states and the identification difficulty of entangled states; moreover, it does not need local Hadamard operations and is more feasible technologically. The project was supported by the National Natural Science Foundation of China under Grant No. 10304022.

  11. Automated Simplification of Full Chemical Mechanisms

    NASA Technical Reports Server (NTRS)

    Norris, A. T.

    1997-01-01

    A code has been developed to automatically simplify full chemical mechanisms. The method employed is based on the Intrinsic Low Dimensional Manifold (ILDM) method of Maas and Pope. The ILDM method is a dynamical systems approach to the simplification of large chemical kinetic mechanisms. By identifying low-dimensional attracting manifolds, the method allows complex full mechanisms to be parameterized by just a few variables, in effect generating reduced chemical mechanisms by an automatic procedure. These resulting mechanisms, however, still retain all the species used in the full mechanism. Full and skeletal mechanisms for various fuels are simplified to a two-dimensional manifold, and the resulting mechanisms are found to compare well with the full mechanisms and show significant improvement over global one-step mechanisms, such as those by Westbrook and Dryer. In addition, by using an ILDM reaction mechanism in a CFD code, a considerable improvement in turn-around time can be achieved.

  12. Forecasting Construction Cost Index based on visibility graph: A network approach

    NASA Astrophysics Data System (ADS)

    Zhang, Rong; Ashuri, Baabak; Shyr, Yu; Deng, Yong

    2018-03-01

    Engineering News-Record (ENR), a professional magazine in the field of global construction engineering, publishes the Construction Cost Index (CCI) every month. Cost estimators and contractors assess projects, arrange budgets and prepare bids by forecasting CCI. However, fluctuations and uncertainties of CCI cause irrational estimations now and then. This paper aims at achieving more accurate predictions of CCI based on a network approach in which the time series is first converted into a visibility graph and future values are forecast by link prediction. According to the experimental results, the proposed method shows satisfactory performance since the error measures are acceptable. Compared with other methods, the proposed method is easier to implement and is able to forecast CCI with fewer errors. The results indicate that the proposed method is efficient at providing considerably accurate CCI predictions, which will contribute to construction engineering by assisting individuals and organizations in reducing costs and making project schedules.
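    The first step of the approach described above, converting a time series into a natural visibility graph, can be sketched in a few lines. The toy index values below are invented for illustration, and the link-prediction forecasting stage is not reproduced here.

```python
"""Sketch: build a natural visibility graph from a short time series.

The series values are made up for illustration; the link-prediction
forecasting step described in the abstract is not reproduced here.
"""
import itertools

def visibility_edges(series):
    """Return edges (i, j) of the natural visibility graph of `series`.

    Points i and j 'see' each other if every intermediate point k lies
    strictly below the straight line connecting (i, y_i) and (j, y_j).
    """
    edges = []
    n = len(series)
    for i, j in itertools.combinations(range(n), 2):
        yi, yj = series[i], series[j]
        visible = all(
            series[k] < yj + (yi - yj) * (j - k) / (j - i)
            for k in range(i + 1, j)
        )
        if visible:
            edges.append((i, j))
    return edges

cci_like = [100.2, 101.0, 100.7, 102.3, 101.9, 103.1, 102.8]  # hypothetical index values
print(visibility_edges(cci_like))
```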

  13. Non-target toxicity of synthetic insecticides on the biological performance and population growth of Bracon hebetor Say.

    PubMed

    Muslim, Mohammad; Ansari, M Shafiq; Hasan, Fazil

    2018-05-24

    Bracon hebetor Say (Hymenoptera: Braconidae) is an important biological control agent of various species of the order Lepidoptera and is extensively used in biological control programs worldwide. The present study evaluated the lethal and sublethal effects of insecticides on B. hebetor using demographic and population growth parameters. Doses of all the tested insecticides were within the maximum range of their recommended field dosages, and adults were treated using the residual glass-vial method. For control experiments, adults were treated with distilled water. Among the tested insecticides, the survivorship of various stages of B. hebetor was considerably prolonged on cyantraniliprole, followed by chlorantraniliprole, and shortest in the chlorpyrifos and profenofos treated groups. Total immature development time was prolonged in the chlorpyrifos and profenofos treated groups. Population growth parameters such as the intrinsic rate of natural increase (rm), net reproductive rate (R0), finite rate of increase (λ) and mean generation time (Tc) were considerably reduced in B. hebetor groups treated with chlorpyrifos and profenofos. However, B. hebetor groups treated with chlorantraniliprole and cyantraniliprole showed little or no difference in population growth parameters when compared with the untreated group. It was also observed that chlorpyrifos and profenofos modified the sex ratio, thereby reducing female emergence. On the basis of the present findings, it can be concluded that all tested insecticides caused considerable ecotoxic effects on B. hebetor compared to the control. However, comparisons among the tested insecticides on the basis of IOBC criteria showed that chlorantraniliprole and cyantraniliprole were less toxic than the other insecticides tested on this biological control agent.

  14. Reducing acquisition time in clinical MRI by data undersampling and compressed sensing reconstruction

    NASA Astrophysics Data System (ADS)

    Hollingsworth, Kieren Grant

    2015-11-01

    MRI is often the most sensitive or appropriate technique for important measurements in clinical diagnosis and research, but lengthy acquisition times limit its use due to cost and considerations of patient comfort and compliance. Once an image field of view and resolution are chosen, the minimum scan acquisition time is normally fixed by the amount of raw data that must be acquired to meet the Nyquist criterion. Recently, there has been research interest in using the theory of compressed sensing (CS) in MR imaging to reduce scan acquisition times. The theory argues that if our target MR image is sparse, having signal information in only a small proportion of pixels (like an angiogram), or if the image can be mathematically transformed to be sparse, then it is possible to use that sparsity to recover a high definition image from substantially less acquired data. This review starts by considering methods of k-space undersampling which have already been incorporated into routine clinical imaging (partial Fourier imaging and parallel imaging), and then explains the basis of using compressed sensing in MRI. The practical considerations of applying CS to MRI acquisitions are discussed, such as designing k-space undersampling schemes, optimizing adjustable parameters in reconstructions and exploiting the power of combined compressed sensing and parallel imaging (CS-PI). A selection of clinical applications that have used CS and CS-PI prospectively is considered. The review concludes by signposting other imaging acceleration techniques under present development before concluding with a consideration of the potential impact and obstacles to bringing compressed sensing into routine use in clinical MRI.
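    A minimal sketch of the compressed-sensing principle described in this review is given below: a pixel-sparse image is recovered from randomly undersampled k-space by iterative soft-thresholding. The phantom, sampling fraction, and threshold are illustrative assumptions, not the review's protocols or any clinical reconstruction software.

```python
"""Minimal compressed-sensing sketch: recover a pixel-sparse, angiogram-like
image from randomly undersampled k-space by iterative soft-thresholding.

The phantom, sampling fraction, and threshold are illustrative assumptions.
"""
import numpy as np

rng = np.random.default_rng(1)
n = 64
image = np.zeros((n, n))
image[rng.integers(0, n, 40), rng.integers(0, n, 40)] = 1.0  # sparse bright pixels

mask = rng.random((n, n)) < 0.33       # keep ~33% of k-space samples at random
y = mask * np.fft.fft2(image)          # undersampled k-space data

def soft(x, t):
    """Soft-thresholding operator promoting sparsity."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

x = np.zeros_like(image)
for _ in range(100):                   # ISTA: data-consistency step + sparsity threshold
    grad = np.fft.ifft2(mask * (np.fft.fft2(x) - y)).real
    x = soft(x - grad, 0.01)

zero_filled = np.fft.ifft2(y).real
print("zero-filled error :", np.linalg.norm(zero_filled - image))
print("CS (ISTA) error   :", np.linalg.norm(x - image))
```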

  15. Assessing the potential effects and cost-effectiveness of programmatic herpes zoster vaccination of elderly in the Netherlands

    PubMed Central

    2010-01-01

    Background Herpes zoster (HZ) is a painful disease affecting a considerable part of the elderly. Programmatic HZ vaccination of elderly people may considerably reduce HZ morbidity and its related costs, but the extent of these effects is unknown. In this article, the potential effects and cost-effectiveness of programmatic HZ vaccination of the elderly in the Netherlands have been assessed according to a framework that was developed to support evidence-based decision making regarding inclusion of new vaccines in the Dutch National Immunization Program. Methods An analytical framework was used combining a checklist, which structured relevant data on the vaccine, pathogen and disease, and a cost-effectiveness analysis. The cost-effectiveness analysis was performed from a societal perspective, using a Markov cohort model. Simultaneous vaccination with influenza was assumed. Results Due to the combination of waning immunity after vaccination and a reduced efficacy of vaccination at high ages, the most favourable cost-effectiveness ratio (€21716 per QALY) for HZ vaccination in the Netherlands was found for 70-year-olds. This estimated ratio is just above the socially accepted threshold in the Netherlands of €20000 per QALY. If additional reduction of postherpetic neuralgia was included, the cost-effectiveness ratio improved (~€10000 per QALY), but uncertainty for this scenario is high. Conclusions Vaccination against HZ at the age of 70 years seems marginally cost-effective in the Netherlands. Due to limited vaccine efficacy, a considerable part of the disease burden caused by HZ will remain, even with optimal acceptance of programmatic vaccination. PMID:20707884
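    For readers unfamiliar with the Markov-cohort machinery used in such analyses, the toy sketch below shows how per-cycle state occupancy, costs and QALYs combine into an incremental cost-effectiveness ratio. Every state, probability, cost and utility in it is an invented placeholder, not an input of the cited Dutch study.

```python
"""Toy Markov-cohort sketch of a cost-effectiveness calculation.

All parameters below are invented placeholders used purely to illustrate
the mechanics of a Markov-cohort model; they are NOT the study's inputs.
"""
import numpy as np

# States: 0 = healthy, 1 = herpes zoster episode, 2 = post-herpetic neuralgia, 3 = dead
def transition_matrix(hz_risk):
    return np.array([
        [1 - hz_risk - 0.01, hz_risk, 0.00, 0.01],
        [0.79,               0.00,    0.20, 0.01],
        [0.30,               0.00,    0.69, 0.01],
        [0.00,               0.00,    0.00, 1.00],
    ])

utility = np.array([0.80, 0.60, 0.50, 0.00])   # QALY weight per state per yearly cycle
cost    = np.array([0.0, 300.0, 800.0, 0.0])   # cost per state per cycle (EUR)

def run_cohort(hz_risk, extra_cost=0.0, cycles=15, discount=0.04):
    P = transition_matrix(hz_risk)
    state = np.array([1.0, 0.0, 0.0, 0.0])     # whole cohort starts healthy
    qalys, costs = 0.0, extra_cost
    for t in range(cycles):
        d = 1.0 / (1.0 + discount) ** t        # discounting of future costs and effects
        qalys += d * state @ utility
        costs += d * state @ cost
        state = state @ P
    return qalys, costs

q0, c0 = run_cohort(hz_risk=0.010)                   # no vaccination
q1, c1 = run_cohort(hz_risk=0.005, extra_cost=100)   # vaccine halves HZ risk, costs 100 EUR
print("ICER (EUR per QALY gained):", (c1 - c0) / (q1 - q0))
```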

  16. Forgiveness and Consideration of Future Consequences in Aggressive Driving

    PubMed Central

    Moore, Michael; Dahlen, Eric R.

    2008-01-01

    Most research on aggressive driving has focused on identifying aspects of driver personality which will exacerbate it (e.g., sensation seeking, impulsiveness, driving anger, etc.). The present study was designed to examine two theoretically relevant but previously unexplored personality factors predicted to reduce the risk of aggressive driving: trait forgiveness and consideration of future consequences. The utility of these variables in predicting aggressive driving and driving anger expression was evaluated among 316 college student volunteers. Hierarchical multiple regressions permitted an analysis of the incremental validity of these constructs beyond respondent gender, age, miles driven per week, and driving anger. Both forgiveness and consideration of future consequences contributed to the prediction of aggressive driving and driving anger expression, independent of driving anger. Research on aggressive driving may be enhanced by greater attention to adaptive, potentially risk-reducing traits. Moreover, forgiveness and consideration of future consequences may have implications for accident prevention. PMID:18760093

  17. Technical considerations to minimize complications of inguinal lymph node dissection

    PubMed Central

    Gupta, Manik K.; Patel, Amar P.

    2017-01-01

    Penile cancer is a rare malignancy with a high propensity for regional dissemination. Current guidelines recommend inguinal lymphadenectomy in patients with penile cancer for palpable inguinal lymph nodes or in certain cases of nonpalpable inguinal lymph nodes. For many years, this procedure was performed with a traditional open approach and carried significant morbidity due to severe lymphedema, flap necrosis, wound infections, and seroma formation. The evolution of inguinal lymphadenectomy surgery for patients with penile cancer to a more minimally invasive approach has greatly reduced the morbidity of the procedure. Complications of inguinal lymphadenectomy can be minimized with modifications in surgical approach with the use of endoscopic, robotic, and various reconstructive methods. This review focuses on various intraoperative techniques to reduce morbidity in inguinal lymphadenectomies for penile cancer. PMID:29184778

  18. A hybrid method for synthetic aperture ladar phase-error compensation

    NASA Astrophysics Data System (ADS)

    Hua, Zhili; Li, Hongping; Gu, Yongjian

    2009-07-01

    As a high-resolution imaging sensor, synthetic aperture ladar produces data containing phase errors whose sources include uncompensated platform motion and atmospheric turbulence distortion. Two previously devised methods, the rank one phase-error estimation (ROPE) algorithm and iterative blind deconvolution (IBD), are reexamined, and from them a hybrid method is built that can recover both the images and the PSFs without any a priori information on the PSF, speeding up the convergence rate through careful choice of the initialization. When integrated into a spotlight-mode SAL imaging model, all three methods can effectively reduce the phase-error distortion. For each approach, the signal-to-noise ratio, root mean square error and CPU time are computed, from which we can see that the convergence rate of the hybrid method is improved because of a more efficient initialization of the blind deconvolution. Moreover, by making a further examination of the hybrid method, the weight distribution between ROPE and IBD is found to be an important factor that affects the final result of the whole compensation process.

  19. [An application of low-invasive access in ultrasound-guided surgery of liquid formation of the abdominal cavity and retroperitoneal space].

    PubMed

    Demin, D B; Laĭkov, A V; Funygin, M S; Chegodaeva, A A; Solodov, Iu Iu; Butina, K V

    2014-01-01

    The article presents a low-invasive method for intraoperative ultrasound-guided surgery. The method had several steps: an access (2-3 cm) was made to a liquid formation with subsequent aspiration of its contents, and necrotic detritus was removed through the wound tract with simultaneous ultrasound monitoring of the efficacy of emptying the cavity via drainage. This approach allowed single-stage sanitization and drainage of cavity formations containing liquid and dense necrotic tissues in the lumen. The method was effective and technically workable in any surgical hospital. At the same time, it was economically reasonable, because there was no need to buy additional equipment. Application of the method considerably shortened hospital stay and reduced lethality.

  20. Isolation and identification of phenolic compounds from rum aged in oak barrels by high-speed countercurrent chromatography/high-performance liquid chromatography-diode array detection-electrospray ionization mass spectrometry and screening for antioxidant activity.

    PubMed

    Regalado, Erik L; Tolle, Sebastian; Pino, Jorge A; Winterhalter, Peter; Menendez, Roberto; Morales, Ana R; Rodríguez, José L

    2011-10-14

    Beverages, especially wines, are well known to contain a variety of health-beneficial bioactive substances, mainly of phenolic nature, which frequently exhibit antioxidant activity. Significant information is available about the separation and identification of polyphenols from some beverages by chromatographic and spectroscopic techniques, but chemical data on the polyphenolic content of rums are considerably scarcer. In this paper, a method involving the all-liquid chromatographic technique of high-speed countercurrent chromatography (HSCCC) combined with high-performance liquid chromatography coupled with diode-array detection and electrospray ionization mass spectrometry (HPLC-DAD-ESI-MS(n)) has been successfully applied for the separation and identification of phenolic compounds in an aged rum. In addition, the phenolic fraction (PF) was assayed for its antioxidant effects using three different free-radical in vitro assays (DPPH·, RO(2)· and spontaneous lipid peroxidation (LPO) on brain homogenates) and the ferric reducing antioxidant power (FRAP) assay. Results showed that PF potently scavenged DPPH and strongly scavenged peroxyl radicals compared to ascorbic acid and butylated hydroxytoluene (BHT), and almost equally inhibited LPO on brain homogenates subjected to spontaneous LPO when compared to quercetin. Moreover, PF also exhibited strong reducing power. This chemical analysis illustrates the rich array of phenols in the aged rum and represents a rapid and suitable method for the isolation and identification of phenolic compounds from mixtures of considerable complexity, achieving high purity and reproducibility with the use of two separation steps. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Caffeic acid protects rat heart mitochondria against isoproterenol-induced oxidative damage

    PubMed Central

    Kumaran, Kandaswamy Senthil

    2010-01-01

    Cardiac mitochondrial dysfunction plays an important role in the pathology of myocardial infarction. The protective effects of caffeic acid on mitochondrial dysfunction in isoproterenol-induced myocardial infarction were studied in Wistar rats. Rats were pretreated with caffeic acid (15 mg/kg) for 10 days. After the pretreatment period, isoproterenol (100 mg/kg) was subcutaneously injected into rats at an interval of 24 h for 2 days to induce myocardial infarction. Isoproterenol-induced rats showed considerably increased levels of serum troponins and heart mitochondrial lipid peroxidation products and considerably decreased glutathione peroxidase and reduced glutathione. Also, considerably decreased activities of isocitrate, succinate, malate, α-ketoglutarate, and NADH dehydrogenases and cytochrome-C-oxidase were observed in the mitochondria of myocardial-infarcted rats. The mitochondrial calcium, cholesterol, free fatty acids, and triglycerides were considerably increased and adenosine triphosphate and phospholipids were considerably decreased in isoproterenol-induced rats. Caffeic acid pretreatment showed considerable protective effects on all the biochemical parameters studied. Myocardial infarct size was much reduced in caffeic acid pretreated isoproterenol-induced rats. Transmission electron microscopic findings also confirmed the protective effects of caffeic acid. The possible mechanisms of caffeic acid in cardiac mitochondrial protection might be due to decreasing free radicals; increasing multienzyme activities, reduced glutathione, and adenosine triphosphate levels; and maintaining lipids and calcium. In vitro studies also confirmed the free-radical-scavenging activity of caffeic acid. Thus, caffeic acid protected rat heart mitochondria against isoproterenol-induced damage. This study may have a significant impact on myocardial-infarcted patients. PMID:20376586

  2. Caffeic acid protects rat heart mitochondria against isoproterenol-induced oxidative damage.

    PubMed

    Kumaran, Kandaswamy Senthil; Prince, Ponnian Stanely Mainzen

    2010-11-01

    Cardiac mitochondrial dysfunction plays an important role in the pathology of myocardial infarction. The protective effects of caffeic acid on mitochondrial dysfunction in isoproterenol-induced myocardial infarction were studied in Wistar rats. Rats were pretreated with caffeic acid (15 mg/kg) for 10 days. After the pretreatment period, isoproterenol (100 mg/kg) was subcutaneously injected into rats at an interval of 24 h for 2 days to induce myocardial infarction. Isoproterenol-induced rats showed considerably increased levels of serum troponins and heart mitochondrial lipid peroxidation products and considerably decreased glutathione peroxidase and reduced glutathione. Also, considerably decreased activities of isocitrate, succinate, malate, α-ketoglutarate, and NADH dehydrogenases and cytochrome-C-oxidase were observed in the mitochondria of myocardial-infarcted rats. The mitochondrial calcium, cholesterol, free fatty acids, and triglycerides were considerably increased and adenosine triphosphate and phospholipids were considerably decreased in isoproterenol-induced rats. Caffeic acid pretreatment showed considerable protective effects on all the biochemical parameters studied. Myocardial infarct size was much reduced in caffeic acid pretreated isoproterenol-induced rats. Transmission electron microscopic findings also confirmed the protective effects of caffeic acid. The possible mechanisms of caffeic acid in cardiac mitochondrial protection might be due to decreasing free radicals; increasing multienzyme activities, reduced glutathione, and adenosine triphosphate levels; and maintaining lipids and calcium. In vitro studies also confirmed the free-radical-scavenging activity of caffeic acid. Thus, caffeic acid protected rat heart mitochondria against isoproterenol-induced damage. This study may have a significant impact on myocardial-infarcted patients.

  3. A nonlocal continuum model for the biaxial buckling analysis of composite nanoplates with shape memory alloy nanowires

    NASA Astrophysics Data System (ADS)

    Farajpour, M. R.; Shahidi, A. R.; Farajpour, A.

    2018-03-01

    In this study, the buckling behavior of a three-layered composite nanoplate reinforced with shape memory alloy (SMA) nanowires is examined. Whereas the upper and lower layers are reinforced with typical nanowires, SMA nanoscale wires are used to strengthen the middle layer of the system. The composite nanoplate is assumed to be under the action of biaxial compressive loading. A scale-dependent mathematical model is presented with the consideration of size effects within the context of Eringen's nonlocal continuum mechanics. Using the one-dimensional Brinson's theory and the Kirchhoff theory of plates, the governing partial differential equations of SMA nanowire-reinforced hybrid nanoplates are derived. Both lateral and longitudinal deflections are taken into consideration in the theoretical formulation and method of solution. In order to reduce the governing differential equations to their corresponding algebraic equations, a discretization approach based on the differential quadrature method is employed. The critical buckling loads of the hybrid nanosystem with various boundary conditions are obtained with the use of a standard eigenvalue solver. It is found that the stability response of SMA composite nanoplates is strongly sensitive to the small scale effect.

  4. Effects of Pretreatment on the Electronic Properties of Plasma Enhanced Chemical Vapor Deposition Hetero-Epitaxial Graphene Devices

    NASA Astrophysics Data System (ADS)

    Zhang, Lian-Chang; Shi, Zhi-Wen; Yang, Rong; Huang, Jian

    2014-09-01

    Quasi-monolayer graphene is successfully grown by the plasma enhanced chemical vapor deposition heteroepitaxial method we reported previously. To measure its electrical properties, the prepared graphene is fabricated into Hall-bar shaped devices by routine micro-fabrication methods. However, impurity molecules adsorbed onto the graphene surface impose considerable doping effects on the one-atom-thick film material. Our experiment demonstrates that pretreatment of the device by heat radiation baking and electrical annealing can dramatically influence the doping state of the graphene and consequently modify the electrical properties. While graphene in the as-fabricated device is highly p-doped, as confirmed by the position of the Dirac point at far more than +60 V, baking treatment at temperatures around 180°C can significantly lower the doping level and reduce the conductivity. The subsequent electrical annealing is much more efficient at desorbing the extrinsic molecules, as confirmed by the in situ measurement, and as a result further modifies the doping state and electrical properties of the graphene, causing a considerable drop in the conductivity and a shift of the Dirac point from beyond +60 V to 0 V.

  5. Science and ethics: Some issues for education

    NASA Astrophysics Data System (ADS)

    Andrew, Jennifer; Robottom, Ian

    2001-11-01

    Ethical issues concerning pain and suffering of animals are necessarily a consideration when it comes to killing pest or feral species in Australia. Within a continent where there are no large predators, many introduced animal species such as rabbits, foxes, horses, donkeys, camels, goats, and mice have been able to thrive, competing with the interests of farmers and graziers, and livestock and food production. These species, thus, gain the label of pest. Many methods now exist to kill these species and, consequently, ethical issues arise concerning the possible pain and suffering caused as a direct result of these methods. Yet within government and scientific communities, ethical issues are reduced to a secondary consideration without serious debate or contention. Ethical issues appear to be at odds with scientific agendas. How can environmental ethics be incorporated as part of science-based decision making that appeals to objectivity and scientific evidence? Within educational institutions as well, the same dilemma exists: How can ethical issues be addressed within the science curriculum and in the classroom? A greater understanding of various perspectives on the subject of environmental ethics and the value positions advocated by proponents of these perspectives may help teachers consider ways of handling such issues in the science classroom.

  6. Methodology, Methods, and Metrics for Testing and Evaluating Augmented Cognition Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greitzer, Frank L.

    The augmented cognition research community seeks cognitive neuroscience-based solutions to improve warfighter performance by applying and managing mitigation strategies to reduce workload and improve the throughput and quality of decisions. The focus of augmented cognition mitigation research is to define, demonstrate, and exploit neuroscience and behavioral measures that support inferences about the warfighter's cognitive state that prescribe the nature and timing of mitigation. A research challenge is to develop valid evaluation methodologies, metrics and measures to assess the impact of augmented cognition mitigations. Two considerations are external validity, which is the extent to which the results apply to operational contexts; and internal validity, which reflects the reliability of performance measures and the conclusions based on analysis of results. The scientific rigor of the research methodology employed in conducting empirical investigations largely affects the validity of the findings. External validity requirements also compel us to demonstrate operational significance of mitigations. Thus it is important to demonstrate effectiveness of mitigations under specific conditions. This chapter reviews some cognitive science and methodological considerations in designing augmented cognition research studies and associated human performance metrics and analysis methods to assess the impact of augmented cognition mitigations.

  7. An interval-parameter mixed integer multi-objective programming for environment-oriented evacuation management

    NASA Astrophysics Data System (ADS)

    Wu, C. Z.; Huang, G. H.; Yan, X. P.; Cai, Y. P.; Li, Y. P.

    2010-05-01

    Large crowds are increasingly common at political, social, economic, cultural and sports events in urban areas. This has led to attention on the management of evacuations under such situations. In this study, we optimise an approximation method for vehicle allocation and route planning in case of an evacuation. This method, based on an interval-parameter multi-objective optimisation model, has potential for use in a flexible decision support system for evacuation management. The modeling solutions are obtained by sequentially solving two sub-models corresponding to lower- and upper-bounds for the desired objective function value. The interval solutions are feasible and stable in the given decision space, and this may reduce the negative effects of uncertainty, thereby improving decision makers' estimates under different conditions. The resulting model can be used for a systematic analysis of the complex relationships among evacuation time, cost and environmental considerations. The results of a case study used to validate the proposed model show that the model does generate useful solutions for planning evacuation management and practices. Furthermore, these results are useful for evacuation planners, not only in making vehicle allocation decisions but also for providing insight into the tradeoffs among evacuation time, environmental considerations and economic objectives.

  8. Modal ring method for the scattering of sound

    NASA Technical Reports Server (NTRS)

    Baumeister, Kenneth J.; Kreider, Kevin L.

    1993-01-01

    The modal element method for acoustic scattering can be simplified when the scattering body is rigid. In this simplified method, called the modal ring method, the scattering body is represented by a ring of triangular finite elements forming the outer surface. The acoustic pressure is calculated at the element nodes. The pressure in the infinite computational region surrounding the body is represented analytically by an eigenfunction expansion. The two solution forms are coupled by the continuity of pressure and velocity on the body surface. The modal ring method effectively reduces the two-dimensional scattering problem to a one-dimensional problem capable of handling very high frequency scattering. In contrast to the boundary element method or the method of moments, which perform a similar reduction in problem dimension, the modal ring method has the added advantage of having a highly banded solution matrix requiring considerably less computer storage. The method shows excellent agreement with analytic results for scattering from rigid circular cylinders over a wide frequency range (1 ≤ ka ≤ 100) in the near and far fields.
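    The storage advantage of a banded solution matrix mentioned above can be illustrated with a generic banded system solved in both dense and band storage. The tridiagonal test matrix below is only a stand-in; it is not the modal ring discretisation itself.

```python
"""Sketch of the storage/solve advantage of a banded system matrix.

The tridiagonal test matrix is generic; it illustrates band storage and a
banded solve, not the modal ring method's actual system matrix.
"""
import numpy as np
from scipy.linalg import solve_banded

n = 2000
main = 4.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)

# Dense storage: n*n entries
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
b = np.ones(n)
x_dense = np.linalg.solve(A, b)

# Banded storage: only the 3 diagonals, in LAPACK band format (3*n entries)
ab = np.zeros((3, n))
ab[0, 1:] = off      # superdiagonal
ab[1, :] = main      # main diagonal
ab[2, :-1] = off     # subdiagonal
x_band = solve_banded((1, 1), ab, b)

print("max difference between dense and banded solves:", np.max(np.abs(x_dense - x_band)))
print("dense storage:", A.nbytes, "bytes; banded storage:", ab.nbytes, "bytes")
```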

  9. Tailoring magnetic properties of Co nanocluster assembled films using hydrogen

    NASA Astrophysics Data System (ADS)

    Romero, C. P.; Volodin, A.; Paddubrouskaya, H.; Van Bael, M. J.; Van Haesendonck, C.; Lievens, P.

    2018-07-01

    Tailoring magnetic properties in nanocluster assembled cobalt (Co) thin films was achieved by admitting a small percentage of H2 gas (∼2%) into the Co gas phase cluster formation chamber prior to deposition. The oxygen content in the films is considerably reduced by the presence of hydrogen during the cluster formation, leading to enhanced magnetic interactions between clusters. Two sets of Co samples were fabricated, one without hydrogen gas and one with hydrogen gas. Magnetic properties of the non-hydrogenated and the hydrogen-treated Co nanocluster assembled films are comparatively studied using magnetic force microscopy and vibrating sample magnetometry. When comparing the two sets of samples the considerably larger coercive field of the H2-treated Co nanocluster film and the extended micrometer-sized magnetic domain structure confirm the enhancement of magnetic interactions between clusters. The thickness of the antiferromagnetic CoO layer is controlled with this procedure and modifies the exchange bias effect in these films. The exchange bias shift is lower for the H2-treated Co nanocluster film, which indicates that a thinner antiferromagnetic CoO reduces the coupling with the ferromagnetic Co. The hydrogen-treatment method can be used to tailor the oxidation levels thus controlling the magnetic properties of ferromagnetic cluster-assembled films.

  10. Effects of defoliation and shading on the physiological cost of reproduction in silky locoweed Oxytropis sericea

    PubMed Central

    Ida, Takashi Y.; Harder, Lawrence D.; Kudo, Gaku

    2012-01-01

    Background The production of flowers, fruits and seeds demands considerable energy and nutrients, which can limit the allocation of these resources to other plant functions and, thereby, influence survival and future reproduction. The magnitude of the physiological costs of reproduction depends on both the factors limiting seed production (pollen, ovules or resources) and the capacity of plants to compensate for high resource demand. Methods To assess the magnitude and consequences of reproductive costs, we used shading and defoliation to reduce photosynthate production by fully pollinated plants of a perennial legume, Oxytropis sericea (Fabaceae), and examined the resulting impact on photosynthate allocation, and nectar, fruit and seed production. Key Results Although these leaf manipulations reduced photosynthesis and nectar production, they did not alter photosynthate allocation, as revealed by 13C tracing, or fruit or seed production. That photosynthate allocation to reproductive organs increased >190 % and taproot mass declined by 29 % between flowering and fruiting indicates that reproduction was physiologically costly. Conclusions The insensitivity of fruit and seed production to leaf manipulation is consistent with either compensatory mobilization of stored resources or ovule limitation. Seed production differed considerably between the two years of the study in association with contrasting precipitation prior to flowering, perhaps reflecting contrasting limits on reproductive performance. PMID:22021817

  11. Site energies and charge transfer rates near pentacene grain boundaries from first-principles calculations

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hajime; Tokita, Yuichi

    2015-03-01

    Charge transfer rates near pentacene grain boundaries are derived by calculating the site energies and transfer integrals of 37 pentacene molecules using first-principles calculations. The site energies decrease considerably near the grain boundaries, and electron traps of up to 300 meV and hole barriers of up to 400 meV are generated. The charge transfer rates across the grain boundaries are found to be reduced by three to five orders of magnitude with a grain boundary gap of 4 Å because of the reduction in the transfer integrals. The electron traps and hole barriers also reduce the electron and hole transfer rates by factors of up to 10 and 50, respectively. It is essential to take the site energies into consideration to determine charge transport near the grain boundaries. We show that the complex site energy distributions near the grain boundaries can be represented by an equivalent site energy difference, which is a constant for any charge transfer path. When equivalent site energy differences are obtained for various grain boundary structures by first-principles calculations, the effects of the grain boundaries on the charge transfer rates are introduced exactly into charge transport simulations, such as the kinetic Monte Carlo method.

  12. Effect of processing on the disappearance of pesticide residues in fresh-cut lettuce: Bioavailability and dietary risk.

    PubMed

    Camara, Miguel A; Barba, Alberto; Cermeño, Sandra; Martinez, Gracia; Oliva, Jose

    2017-12-02

    The aim of this research is to establish the processing factors of six pesticides during the preparation of fresh-cut lettuce and to assess the risk of ingestion of pesticide residues associated with its consumption. A field study was carried out on the dissipation of three insecticides (imidacloprid, tebufenozide, cypermethrin) and three fungicides (metalaxyl, tebuconazole, azoxystrobin) under treatment conditions simulating those used for commercial fresh-cut lettuce. A simultaneous residue analysis method was validated using QuEChERS extraction with acetonitrile and GC-MS and LC-MS/MS analysis. The residues detected after field application never exceeded the established Maximum Residue Limits. The processing factors were generally less than 1 (between 0.34 for tebufenozide and 0.53 for imidacloprid), indicating that the process, as a whole, considerably reduces residue levels in processed lettuce compared to fresh lettuce. It is confirmed that cutting, followed by washing and drying, considerably reduces the residues. A matrix effect in the dialyzation of the pesticides is observed, and the in vitro study of bioavailability establishes a low percentage of stomach absorption capacity (<15%). The EDI/ADI ratios found in all cases were well below their ADI values, and the dietary exposure assessed (EDI) in fresh-cut lettuce showed no concerns for consumer health.

  13. Optimizing highly noncoplanar VMAT trajectories: the NoVo method.

    PubMed

    Langhans, Marco; Unkelbach, Jan; Bortfeld, Thomas; Craft, David

    2018-01-16

    We introduce a new method called NoVo (Noncoplanar VMAT Optimization) to produce volumetric modulated arc therapy (VMAT) treatment plans with noncoplanar trajectories. While the use of noncoplanar beam arrangements for intensity modulated radiation therapy (IMRT), and in particular high fraction stereotactic radiosurgery (SRS), is common, noncoplanar beam trajectories for VMAT are less common as the availability of treatment machines handling these is limited. For both IMRT and VMAT, the beam angle selection problem is highly nonconvex in nature, which is why automated beam angle selection procedures have not entered mainstream clinical usage. NoVo determines a noncoplanar VMAT solution (i.e. the simultaneous trajectories of the gantry and the couch) by first computing a 4π solution (beams from every possible direction, suitably discretized) and then eliminating beams by examining fluence contributions. All beam angles are also scored via geometrical considerations only, to assess the usefulness of the whole beam space in a very short time. A custom path-finding algorithm is applied to find an optimized, continuous trajectory through the most promising beam angles using the calculated score of the beam space. Finally, using this trajectory a VMAT plan is optimized. For three clinical cases, a lung, brain, and liver case, we compare NoVo to the ideal 4π solution, nine-beam noncoplanar IMRT, coplanar VMAT, and a recently published noncoplanar VMAT algorithm. NoVo comes closest to the 4π solution for the lung case (brain and liver cases: second), while also improving the solution time by using geometrical considerations, followed by a time-effective iterative process reducing the 4π solution. Compared to a recently published noncoplanar VMAT algorithm, using NoVo the computation time is reduced by a factor of 2-3 (depending on the case). Compared to coplanar VMAT, NoVo reduces the objective function value by 24%, 49% and 6% for the lung, brain and liver cases, respectively.

  14. Fathers and HIV: considerations for families

    PubMed Central

    2010-01-01

    Background Fathers are intricately bound up in all aspects of family life. This review examines fathers in the presence of HIV: from desire for a child, through conception issues, to a summary of the knowledge base on fathers within families affected by HIV. Methods A mixed-methods approach is used, given the scarcity of literature. A review is provided on paternal and male factors in relation to the desire for a child, HIV testing in pregnancy, fatherhood and conception, fatherhood and drug use, paternal support and disengagement, fatherhood and men who have sex with men (MSM), and paternal effects on child development in the presence of HIV. Literature-based reviews and systematic review techniques are used to access available data. Primary data are reported on the issue of parenting for men who have sex with men. Results Men with HIV desire fatherhood. This is established in studies from numerous countries, although fatherhood desires may be lower for HIV-positive men than HIV-negative men. Couples do not always agree, and in some studies, male desires for a child are greater than those of their female partners. Despite reduced fertility, support and services, many proceed to parenting, whether in seroconcordant or serodiscordant relationships. There is growing knowledge about fertility options to reduce transmission risk to uninfected partners and to offspring. Within the HIV field, there is limited research on fathering and fatherhood desires in a number of difficult-to-reach groups. There are, however, specific considerations for men who have sex with men and those affected by drug use. Conception in the presence of HIV needs to be managed and informed to reduce the risk of infection to partners and children. Further, paternal support plays a role in maternal management. Conclusions Strategies to improve HIV testing of fathers are needed. Paternal death has a negative impact on child development and paternal survival is protective. It is important to understand fathers and fathering and to approach childbirth from a family perspective. PMID:20573286

  15. Efficient Reverse-Engineering of a Developmental Gene Regulatory Network

    PubMed Central

    Cicin-Sain, Damjan; Ashyraliyev, Maksat; Jaeger, Johannes

    2012-01-01

    Understanding the complex regulatory networks underlying development and evolution of multi-cellular organisms is a major problem in biology. Computational models can be used as tools to extract the regulatory structure and dynamics of such networks from gene expression data. This approach is called reverse engineering. It has been successfully applied to many gene networks in various biological systems. However, to reconstitute the structure and non-linear dynamics of a developmental gene network in its spatial context remains a considerable challenge. Here, we address this challenge using a case study: the gap gene network involved in segment determination during early development of Drosophila melanogaster. A major problem for reverse-engineering pattern-forming networks is the significant amount of time and effort required to acquire and quantify spatial gene expression data. We have developed a simplified data processing pipeline that considerably increases the throughput of the method, but results in data of reduced accuracy compared to those previously used for gap gene network inference. We demonstrate that we can infer the correct network structure using our reduced data set, and investigate minimal data requirements for successful reverse engineering. Our results show that timing and position of expression domain boundaries are the crucial features for determining regulatory network structure from data, while it is less important to precisely measure expression levels. Based on this, we define minimal data requirements for gap gene network inference. Our results demonstrate the feasibility of reverse-engineering with much reduced experimental effort. This enables more widespread use of the method in different developmental contexts and organisms. Such systematic application of data-driven models to real-world networks has enormous potential. Only the quantitative investigation of a large number of developmental gene regulatory networks will allow us to discover whether there are rules or regularities governing development and evolution of complex multi-cellular organisms. PMID:22807664

  16. Optimizing highly noncoplanar VMAT trajectories: the NoVo method

    NASA Astrophysics Data System (ADS)

    Langhans, Marco; Unkelbach, Jan; Bortfeld, Thomas; Craft, David

    2018-01-01

    We introduce a new method called NoVo (Noncoplanar VMAT Optimization) to produce volumetric modulated arc therapy (VMAT) treatment plans with noncoplanar trajectories. While the use of noncoplanar beam arrangements for intensity modulated radiation therapy (IMRT), and in particular high fraction stereotactic radiosurgery (SRS), is common, noncoplanar beam trajectories for VMAT are less common as the availability of treatment machines handling these is limited. For both IMRT and VMAT, the beam angle selection problem is highly nonconvex in nature, which is why automated beam angle selection procedures have not entered mainstream clinical usage. NoVo determines a noncoplanar VMAT solution (i.e. the simultaneous trajectories of the gantry and the couch) by first computing a 4π solution (beams from every possible direction, suitably discretized) and then eliminating beams by examining fluence contributions. All beam angles are also scored via geometrical considerations only, to assess the usefulness of the whole beam space in a very short time. A custom path-finding algorithm is applied to find an optimized, continuous trajectory through the most promising beam angles using the calculated score of the beam space. Finally, using this trajectory a VMAT plan is optimized. For three clinical cases, a lung, brain, and liver case, we compare NoVo to the ideal 4π solution, nine-beam noncoplanar IMRT, coplanar VMAT, and a recently published noncoplanar VMAT algorithm. NoVo comes closest to the 4π solution for the lung case (brain and liver cases: second), while also improving the solution time by using geometrical considerations, followed by a time-effective iterative process reducing the 4π solution. Compared to a recently published noncoplanar VMAT algorithm, using NoVo the computation time is reduced by a factor of 2-3 (depending on the case). Compared to coplanar VMAT, NoVo reduces the objective function value by 24%, 49% and 6% for the lung, brain and liver cases, respectively.

  17. 78 FR 62426 - Use of Differential Income Stream as an Application of the Income Method and as a Consideration...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-22

    ... Differential Income Stream as an Application of the Income Method and as a Consideration in Assessing the Best Method; Correction AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Correcting amendment... method in connection with a cost sharing arrangement and as a specified application of the income method...

  18. A Method for Forecasting the Commercial Air Traffic Schedule in the Future

    NASA Technical Reports Server (NTRS)

    Long, Dou; Lee, David; Gaier, Eric; Johnson, Jesse; Kostiuk, Peter

    1999-01-01

    This report presents an integrated set of models that forecasts air carriers' future operations when delays due to limited terminal-area capacity are considered. This report models the industry as a whole, avoiding unnecessary details of competition among the carriers. To develop the schedule outputs, we first present a model to forecast the unconstrained flight schedules in the future, based on the assumption of rational behavior of the carriers. Then we develop a method to modify the unconstrained schedules, accounting for effects of congestion due to limited NAS capacities. Our underlying assumption is that carriers will modify their operations to keep mean delays within certain limits. We estimate values for those limits from changes in planned block times reflected in the OAG. Our method for modifying schedules takes many means of reducing the delays into consideration, albeit some of them indirectly. The direct actions include depeaking, operating in off-hours, and reducing hub airports' operations. Indirect actions include using secondary airports, using larger aircraft, and selecting new hub airports, which, we assume, have already been modeled in the FAA's TAF. Users of our suite of models can substitute an alternative forecast for the TAF.

  19. Low altitude unmanned aerial vehicle for characterising remediation effectiveness following the FDNPP accident.

    PubMed

    Martin, P G; Payton, O D; Fardoulis, J S; Richards, D A; Yamashiki, Y; Scott, T B

    2016-01-01

    On the 11th of March 2011, the Great Tōhoku Earthquake occurred 70 km off the eastern coast of Japan, generating a large 14 m high tsunami. The ensuing catalogue of events over the succeeding 12 d resulted in the release of considerable quantities of radioactive material into the environment. Important to the large-scale remediation of the affected areas is the accurate and high spatial resolution characterisation of contamination, including the verification of decontaminated areas. To enable this, a low altitude unmanned aerial vehicle equipped with a lightweight gamma-spectrometer and height normalisation system was used to produce sub-meter resolution maps of contamination. This system provided a valuable method to examine both contaminated and remediated areas rapidly, whilst greatly reducing the dose received by the operator, typically in localities formerly inaccessible to ground-based survey methods. The characterisation of three sites within Fukushima Prefecture is presented: one remediated (and a site of much previous attention), one un-remediated and a third having been subjected to an alternative method to reduce emitted radiation dose. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Small UAV Automatic Ground Collision Avoidance System Design Considerations and Flight Test Results

    NASA Technical Reports Server (NTRS)

    Sorokowski, Paul; Skoog, Mark; Burrows, Scott; Thomas, SaraKatie

    2015-01-01

    The National Aeronautics and Space Administration (NASA) Armstrong Flight Research Center Small Unmanned Aerial Vehicle (SUAV) Automatic Ground Collision Avoidance System (Auto GCAS) project demonstrated several important collision avoidance technologies. First, the SUAV Auto GCAS design included capabilities to take advantage of terrain avoidance maneuvers flying turns to either side as well as straight over terrain. Second, the design also included innovative digital elevation model (DEM) scanning methods. The combination of multi-trajectory options and new scanning methods demonstrated the ability to reduce the nuisance potential of the SUAV while maintaining robust terrain avoidance. Third, the Auto GCAS algorithms were hosted on the processor inside a smartphone, providing a lightweight hardware configuration for use in either the ground control station or on board the test aircraft. Finally, compression of DEM data for the entire Earth and successful hosting of that data on the smartphone was demonstrated. The SUAV Auto GCAS project demonstrated that together these methods and technologies have the potential to dramatically reduce the number of controlled flight into terrain mishaps across a wide range of aviation platforms with similar capabilities including UAVs, general aviation aircraft, helicopters, and model aircraft.
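    The multi-trajectory terrain scanning logic described above, in which an avoidance manoeuvre is commanded only if every candidate escape path would violate the clearance buffer, can be sketched on a toy digital elevation model. The grid, trajectory shapes and buffer values below are invented for illustration and do not reflect the project's actual algorithms or data.

```python
"""Toy sketch of multi-trajectory terrain scanning for ground collision
avoidance: avoidance is required only when every candidate escape path
(turn left, straight ahead, turn right) would violate the clearance buffer.
The DEM, trajectories, and buffer are invented placeholders.
"""
import numpy as np

rng = np.random.default_rng(5)
dem = 100.0 + 20.0 * rng.random((50, 50))   # terrain elevation grid (m)
dem[20:30, 25:35] += 80.0                   # a ridge directly ahead of the aircraft

aircraft_alt = 170.0                        # current altitude (m)
clearance = 30.0                            # required buffer above terrain (m)
row, col = 25, 10                           # current grid position, flying in +col direction

def candidate_cells(turn):
    """Grid cells scanned along a straight or gently turning escape path."""
    cells = []
    r = row
    for step in range(1, 15):
        r += turn if step % 3 == 0 else 0   # crude lateral drift for turning paths
        cells.append((int(np.clip(r, 0, 49)), int(np.clip(col + step, 0, 49))))
    return cells

def path_is_safe(cells):
    return all(aircraft_alt - dem[r, c] >= clearance for r, c in cells)

paths = {"left": candidate_cells(-1), "straight": candidate_cells(0), "right": candidate_cells(+1)}
safe = {name: path_is_safe(cells) for name, cells in paths.items()}
print("path safety:", safe)
print("avoidance required:", not any(safe.values()))
```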

  1. A fast solver for the Helmholtz equation based on the generalized multiscale finite-element method

    NASA Astrophysics Data System (ADS)

    Fu, Shubin; Gao, Kai

    2017-11-01

    Conventional finite-element methods for solving the acoustic-wave Helmholtz equation in highly heterogeneous media usually require a finely discretized mesh to represent the medium property variations with sufficient accuracy. Computational costs for solving the Helmholtz equation can therefore be considerable for complicated and large geological models. Based on the generalized multiscale finite-element theory, we develop a novel continuous Galerkin method to solve the Helmholtz equation in acoustic media with spatially variable velocity and mass density. Instead of using conventional polynomial basis functions, we use multiscale basis functions to form the approximation space on the coarse mesh. The multiscale basis functions are obtained by multiplying the eigenfunctions of a carefully designed local spectral problem with an appropriate multiscale partition of unity. These multiscale basis functions can effectively incorporate the characteristics of the heterogeneous medium's fine-scale variations, thus enabling us to obtain an accurate solution to the Helmholtz equation without directly solving the large discrete system formed on the fine mesh. Numerical results show that our new solver can significantly reduce the dimension of the discrete Helmholtz equation system and can also noticeably reduce the computational time.

  2. A novel one-class SVM based negative data sampling method for reconstructing proteome-wide HTLV-human protein interaction networks.

    PubMed

    Mei, Suyu; Zhu, Hao

    2015-01-26

    Protein-protein interaction (PPI) prediction is generally treated as a problem of binary classification, wherein negative data sampling is still an open problem to be addressed. The commonly used random sampling is prone to yield less representative negative data with considerable false negatives. Meanwhile, rational constraints are seldom exerted on model selection to reduce the risk of false positive predictions for most of the existing computational methods. In this work, we propose a novel negative data sampling method based on a one-class SVM (support vector machine) to predict proteome-wide protein interactions between the HTLV retrovirus and Homo sapiens, wherein the one-class SVM is used to choose reliable and representative negative data, and a two-class SVM is used to yield proteome-wide outcomes as predictive feedback for rational model selection. Computational results suggest that the one-class SVM is better suited for use as a negative data sampling method than a two-class PPI predictor, and that the predictive-feedback-constrained model selection helps to yield a rational predictive model that reduces the risk of false positive predictions. Some predictions have been validated by the recent literature. Lastly, gene ontology based clustering of the predicted PPI networks is conducted to provide valuable cues for the pathogenesis of the HTLV retrovirus.
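    A hedged sketch of the negative-sampling scheme described above is given below using scikit-learn: a one-class SVM is fitted on known positives, the least-similar unlabeled pairs are taken as reliable negatives, and a two-class SVM is then trained. The synthetic feature vectors, dimensions and thresholds are assumptions made only for illustration.

```python
"""Sketch of one-class-SVM-based negative sampling for PPI prediction.

Feature vectors here are synthetic random data standing in for protein-pair
features; dimensions and thresholds are illustrative assumptions.
"""
import numpy as np
from sklearn.svm import OneClassSVM, SVC

rng = np.random.default_rng(7)

pos = rng.normal(loc=1.0, scale=1.0, size=(200, 20))          # known interacting pairs
unlabeled = rng.normal(loc=0.0, scale=1.5, size=(2000, 20))   # candidate pairs, label unknown

# 1) Fit a one-class SVM on the positives only.
occ = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(pos)

# 2) Pick reliable negatives: unlabeled pairs that look least like the positives.
scores = occ.decision_function(unlabeled)   # lower = less similar to the positive class
neg = unlabeled[np.argsort(scores)[:200]]   # take the 200 most dissimilar pairs

# 3) Train an ordinary two-class SVM on positives vs. sampled negatives.
X = np.vstack([pos, neg])
y = np.concatenate([np.ones(len(pos)), np.zeros(len(neg))])
clf = SVC(kernel="rbf", gamma="scale", probability=True).fit(X, y)

print("predicted interaction probability for 5 unlabeled pairs:",
      clf.predict_proba(unlabeled[:5])[:, 1].round(3))
```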

  3. Image transmission system using adaptive joint source and channel decoding

    NASA Astrophysics Data System (ADS)

    Liu, Weiliang; Daut, David G.

    2005-03-01

    In this paper, an adaptive joint source and channel decoding method is designed to accelerate the convergence of the iterative log-domain sum-product decoding procedure of LDPC codes as well as to improve the reconstructed image quality. Error resilience modes are used in the JPEG2000 source codec, which makes it possible to provide useful source-decoded information to the channel decoder. After each iteration, a tentative decoding is made and the channel-decoded bits are then sent to the JPEG2000 decoder. Due to the error resilience modes, some bits are known to be either correct or in error. The positions of these bits are then fed back to the channel decoder. The log-likelihood ratios (LLR) of these bits are then modified by a weighting factor for the next iteration. By observing the statistics of the decoding procedure, the weighting factor is designed as a function of the channel condition: for lower channel SNR, a larger factor is assigned, and vice versa. Results show that the proposed joint decoding method can greatly reduce the number of iterations, and thereby reduce the decoding delay considerably. At the same time, this method always outperforms the non-source-controlled decoding method by up to 5 dB in terms of PSNR for various reconstructed images.
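    The feedback step described above, reweighting the LLRs of bits that the source decoder has verified or flagged, might look roughly like the sketch below. The weighting schedule and the handling of erroneous bits are assumed placeholders rather than the paper's exact design.

```python
"""Sketch of the LLR feedback step: bits that the JPEG2000 error-resilience
checks mark as correct are reinforced and bits marked as erroneous are
flipped and attenuated before the next decoding iteration. The weighting
schedule (a simple function of channel SNR) is an assumed placeholder.
"""
import numpy as np

def weight_factor(snr_db):
    """Assumed schedule: stronger correction at low SNR, gentler at high SNR."""
    return float(np.clip(3.0 - 0.2 * snr_db, 1.0, 3.0))

def adjust_llrs(llrs, known_correct, known_error, snr_db):
    """Return modified LLRs for the next sum-product iteration.

    llrs          -- current log-likelihood ratios (sign = hard decision)
    known_correct -- indices the source decoder verified as correct
    known_error   -- indices the source decoder flagged as erroneous
    """
    w = weight_factor(snr_db)
    out = llrs.copy()
    out[known_correct] *= w        # boost confidence in verified bits
    out[known_error] *= -1.0 / w   # flip and weaken bits known to be wrong
    return out

llrs = np.array([2.1, -0.4, 0.3, -3.0, 0.9])
print(adjust_llrs(llrs, known_correct=[0, 3], known_error=[2], snr_db=5.0))
```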

  4. Fitting the Reduced RUM with Mplus: A Tutorial

    ERIC Educational Resources Information Center

    Chiu, Chia-Yi; Köhn, Hans-Friedrich; Wu, Huey-Min

    2016-01-01

    The Reduced Reparameterized Unified Model (Reduced RUM) is a diagnostic classification model for educational assessment that has received considerable attention among psychometricians. However, the computational options for researchers and practitioners who wish to use the Reduced RUM in their work, but do not feel comfortable writing their own…

  5. Linear reduction methods for tag SNP selection.

    PubMed

    He, Jingwu; Zelikovsky, Alex

    2004-01-01

    It is widely hoped that constructing a complete human haplotype map will help to associate complex diseases with certain SNPs. Unfortunately, the number of SNPs is huge and it is very costly to sequence many individuals. Therefore, it is desirable to reduce the number of SNPs that should be sequenced to a considerably smaller number of informative representatives, so-called tag SNPs. In this paper, we propose a new linear algebra based method for selecting and using tag SNPs. Our method is purely combinatorial and can be combined with linkage disequilibrium (LD) and block based methods. We measure the quality of our tag SNP selection algorithm by comparing actual SNPs with SNPs linearly predicted from linearly chosen tag SNPs. We obtain extremely good compression and prediction rates. For example, for long haplotypes (>25000 SNPs), knowing only 0.4% of all SNPs we predict the entire unknown haplotype with 2% accuracy while the prediction method is based on a 10% sample of the population.
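    A simplified, real-valued stand-in for the linear selection-and-prediction idea is sketched below: tag SNPs are chosen as columns that increase the rank of the haplotype matrix, and the remaining SNPs are predicted from them by least squares. The synthetic haplotypes (perfect LD by construction) and the greedy rank test are illustrative assumptions, not the paper's combinatorial algorithm.

```python
"""Sketch of linear-algebra-based tag SNP selection and prediction.

The haplotype matrix is synthetic (columns are copies or complements of a
few base SNPs, i.e. perfect LD); selecting tags by real-valued rank and
predicting the rest by least squares is a simplified stand-in for the
combinatorial method described in the abstract.
"""
import numpy as np

rng = np.random.default_rng(3)

n_ind, n_base, n_extra = 60, 8, 32
base = rng.integers(0, 2, size=(n_ind, n_base))
extra_cols = []
for _ in range(n_extra):
    c = base[:, rng.integers(0, n_base)]
    extra_cols.append(c if rng.random() < 0.5 else 1 - c)   # copy or complement (perfect LD)
haplo = np.hstack([base, np.array(extra_cols).T])

# Select tag SNPs greedily: keep a column only if it increases the matrix rank.
tags = []
for j in range(haplo.shape[1]):
    if np.linalg.matrix_rank(haplo[:, tags + [j]]) > len(tags):
        tags.append(j)
others = [j for j in range(haplo.shape[1]) if j not in tags]

# Predict every non-tag SNP as a linear combination of the tag SNPs, then round to 0/1.
T = haplo[:, tags].astype(float)
coeffs, *_ = np.linalg.lstsq(T, haplo[:, others].astype(float), rcond=None)
pred = np.clip(np.rint(T @ coeffs), 0, 1)

acc = (pred == haplo[:, others]).mean()
print(f"{len(tags)} tag SNPs out of {haplo.shape[1]} predict the rest with accuracy {acc:.2%}")
```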

  6. Global optimization algorithms to compute thermodynamic equilibria in large complex systems with performance considerations

    DOE PAGES

    Piro, M. H. A.; Simunovic, S.

    2016-03-17

    Several global optimization methods are reviewed that attempt to ensure that the integral Gibbs energy of a closed isothermal isobaric system is a global minimum, in order to satisfy the necessary and sufficient conditions for thermodynamic equilibrium. In particular, the integral Gibbs energy function of a multicomponent system containing non-ideal phases may be highly non-linear and non-convex, which makes finding a global minimum a challenge. Consequently, a poor numerical approach may lead one to the false belief of equilibrium. Furthermore, confirming that one reaches a global minimum, and that this is achieved with satisfactory computational performance, becomes increasingly more challenging in systems containing many chemical elements and a correspondingly large number of species and phases. Several numerical methods that have been used for this specific purpose are reviewed with a benchmark study of three of the more promising methods using five case studies of varying complexity. A modification of the conventional Branch and Bound method is presented that is well suited to a wide array of thermodynamic applications, including complex phases with many constituents and sublattices, and ionic phases that must adhere to charge neutrality constraints. Also, a novel method is presented that efficiently solves the system of linear equations by exploiting the unique structure of the Hessian matrix, which reduces the calculation from an O(N³) operation to an O(N) operation. As a result, this combined approach demonstrates efficiency, reliability and capabilities that are favorable for integration of thermodynamic computations into multi-physics codes with inherent performance considerations.

  7. Global optimization algorithms to compute thermodynamic equilibria in large complex systems with performance considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piro, M. H. A.; Simunovic, S.

    Several global optimization methods are reviewed that attempt to ensure that the integral Gibbs energy of a closed isothermal isobaric system is a global minimum, in order to satisfy the necessary and sufficient conditions for thermodynamic equilibrium. In particular, the integral Gibbs energy function of a multicomponent system containing non-ideal phases may be highly non-linear and non-convex, which makes finding a global minimum a challenge. Consequently, a poor numerical approach may lead one to the false belief of equilibrium. Furthermore, confirming that one reaches a global minimum, and that this is achieved with satisfactory computational performance, becomes increasingly more challenging in systems containing many chemical elements and a correspondingly large number of species and phases. Several numerical methods that have been used for this specific purpose are reviewed with a benchmark study of three of the more promising methods using five case studies of varying complexity. A modification of the conventional Branch and Bound method is presented that is well suited to a wide array of thermodynamic applications, including complex phases with many constituents and sublattices, and ionic phases that must adhere to charge neutrality constraints. Also, a novel method is presented that efficiently solves the system of linear equations by exploiting the unique structure of the Hessian matrix, which reduces the calculation from an O(N³) operation to an O(N) operation. As a result, this combined approach demonstrates efficiency, reliability and capabilities that are favorable for integration of thermodynamic computations into multi-physics codes with inherent performance considerations.

  8. 76 FR 80249 - Use of Differential Income Stream as a Consideration in Assessing the Best Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-23

    ... Differential Income Stream as a Consideration in Assessing the Best Method AGENCY: Internal Revenue Service... method in connection with a cost sharing arrangement. The text of these temporary regulations also serves... unreasonable positions in applying the income method by using relatively low licensing discount rates, and...

  9. Toxicity of 4,346 chemicals to larval lampreys and fishes

    USGS Publications Warehouse

    Applegate, Vernon C.; Howell, John H.; Hall, A.E.; Smith, Manning A.

    1957-01-01

    The problem of controlling the sea lamprey in the upper Great Lakes has received considerable attention in recent years and requires no review here (Applegate and Moffett, 1955). Electromechanical weirs and traps and electrical barriers have been developed which can be successfully employed to block and/or destroy spawning runs of adult sea lampreys. These devices, when installed in all known spawning streams, provide an effective method of reducing the numbers of sea lampreys in each lake basin. Initial efforts at control of the lamprey have employed these devices (Applegate, Smith, and Nielsen, 1952; Erkkila, Smith, and McLain, 1956).

  10. Committee Opinion No. 642: Increasing Access to Contraceptive Implants and Intrauterine Devices to Reduce Unintended Pregnancy.

    PubMed

    2015-10-01

    Unintended pregnancy persists as a major public health problem in the United States. Although lowering unintended pregnancy rates requires multiple approaches, individual obstetrician-gynecologists may contribute by increasing access to contraceptive implants and intrauterine devices. Obstetrician-gynecologists should encourage consideration of implants and intrauterine devices for all appropriate candidates, including nulliparous women and adolescents. Obstetrician-gynecologists should adopt best practices for long-acting reversible contraception insertion. Obstetrician-gynecologists are encouraged to advocate for coverage and appropriate payment and reimbursement for every contraceptive method by all payers in all clinically appropriate circumstances.

  11. Energy Efficiency Building Code for Commercial Buildings in Sri Lanka

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busch, John; Greenberg, Steve; Rubinstein, Francis

    2000-09-30

    1.1.1 To encourage energy efficient design or retrofit of commercial buildings so that they may be constructed, operated, and maintained in a manner that reduces the use of energy without constraining the building function, the comfort, health, or the productivity of the occupants and with appropriate regard for economic considerations. 1.1.2 To provide criteria and minimum standards for energy efficiency in the design or retrofit of commercial buildings and provide methods for determining compliance with them. 1.1.3 To encourage energy efficient designs that exceed these criteria and minimum standards.

  12. Efficient many-party controlled teleportation of multiqubit quantum information via entanglement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang Chuiping; Department of Chemistry, University of Kansas, and Kansas Center for Advanced Scientific Computing, Lawrence, Kansas 66045; Chu, Shih-I

    2004-08-01

    We present a way to teleport multiqubit quantum information from a sender to a distant receiver via the control of many agents in a network. We show that the original state of each qubit can be restored by the receiver as long as all the agents collaborate. However, even if one agent does not cooperate, the receiver cannot fully recover the original state of each qubit. The method operates essentially through entangling quantum information during teleportation, in such a way that the required auxiliary qubit resources, local operation, and classical communication are considerably reduced for the present purpose.

  13. Powered lower limb orthoses for gait rehabilitation

    PubMed Central

    Ferris, Daniel P.; Sawicki, Gregory S.; Domingo, Antoinette

    2006-01-01

    Bodyweight supported treadmill training has become a prominent gait rehabilitation method in leading rehabilitation centers. This type of locomotor training has many functional benefits but the labor costs are considerable. To reduce therapist effort, several groups have developed large robotic devices for assisting treadmill stepping. A complementary approach that has not been adequately explored is to use powered lower limb orthoses for locomotor training. Recent advances in robotic technology have made lightweight powered orthoses feasible and practical. An advantage to using powered orthoses as rehabilitation aids is they allow practice starting, turning, stopping, and avoiding obstacles during overground walking. PMID:16568153

  14. Magnetically controlled multifrequency invisibility cloak with a single shell of ferrite material

    NASA Astrophysics Data System (ADS)

    Wang, Xiaohua; Liu, Youwen

    2015-02-01

    A magnetically controlled multifrequency invisibility cloak with a single shell of isotropic and homogeneous ferrite material has been investigated based on the scattering cancellation method from the Mie scattering theory. The analytical and simulated results have demonstrated that such a shell can drastically reduce the total scattering cross-section of this cloaking system at multiple frequencies. These multiple cloaking frequencies can be externally controlled since the magnetic permeability of ferrites is readily tuned by the applied magnetic field. This may provide a potential way to design a tunable multifrequency invisibility cloak with considerable flexibility.

  15. Magnetic flux density reconstruction using interleaved partial Fourier acquisitions in MREIT.

    PubMed

    Park, Hee Myung; Nam, Hyun Soo; Kwon, Oh In

    2011-04-07

    Magnetic resonance electrical impedance tomography (MREIT) has been introduced as a non-invasive modality to visualize the internal conductivity and/or current density of an electrically conductive object by the injection of current. In order to measure a magnetic flux density signal in MREIT, the phase difference approach in an interleaved encoding scheme cancels the systematic artifacts accumulated in phase signals and also reduces the random noise effect. However, it is important to reduce scan duration maintaining spatial resolution and sufficient contrast, in order to allow for practical in vivo implementation of MREIT. The purpose of this paper is to develop a coupled partial Fourier strategy in the interleaved sampling in order to reduce the total imaging time for an MREIT acquisition, whilst maintaining an SNR of the measured magnetic flux density comparable to what is achieved with complete k-space data. The proposed method uses two key steps: one is to update the magnetic flux density by updating the complex densities using the partially interleaved k-space data and the other is to fill in the missing k-space data iteratively using the updated background field inhomogeneity and magnetic flux density data. Results from numerical simulations and animal experiments demonstrate that the proposed method reduces considerably the scanning time and provides resolution of the recovered B(z) comparable to what is obtained from complete k-space data.

  16. Is traditional contraceptive use in Moldova associated with poverty and isolation?

    PubMed

    Lyons-Amos, Mark J; Durrant, Gabriele B; Padmadas, Sabu S

    2011-05-01

    This study investigates the correlates of traditional contraceptive use in Moldova, a poor country in Europe with one of the highest proportions of traditional contraceptive method users. The high reliance on traditional methods, particularly in the context of sub-replacement level fertility rate, has not been systematically evaluated in demographic research. Using cross-sectional data on a sub-sample of 6039 sexually experienced women from the 2005 Moldovan Demographic and Health Survey, this study hypothesizes that (a) economic and spatial disadvantages increase the likelihood of traditional method use, and (b) high exposure to family planning/reproductive health (FP/RH) programmes increases the propensity to modern method use. Multilevel multinomial models are used to examine the correlates of traditional method use controlling for exposure to sexual activity, socioeconomic and demographic characteristics and data structure. The results show that economic disadvantage increases the probability of traditional method use, but the overall effect is small. Although higher family planning media exposure decreases the reliance on traditional methods among younger women, it has only a marginal effect in increasing modern method use among older women. Family planning programmes designed to encourage women to switch from traditional to modern methods have some success--although the effect is considerably reduced in regions outside of the capital Chisinau. The study concludes that FP/RH efforts directed towards the poorest may have limited impact, but interventions targeted at older women could reduce the burden of unwanted pregnancies and abortions. Addressing differentials in accessing modern methods could improve uptake in rural areas.

  17. On eco-efficient technologies to minimize industrial water consumption

    NASA Astrophysics Data System (ADS)

    Amiri, Mohammad C.; Mohammadifard, Hossein; Ghaffari, Ghasem

    2016-07-01

    Purpose - Water scarcity will place further stress on available water systems and decrease the security of water in many areas. Innovative methods to minimize industrial water usage and waste production are therefore of paramount importance for extending fresh water resources, which are the main life-support systems in many arid regions of the world. This paper demonstrates that there are good opportunities for many industries to save water and decrease waste water in the softening process by substituting traditional methods with eco-friendly ones. The patented puffing method is an eco-efficient and viable technology for water saving and waste reduction in the lime softening process. Design/methodology/approach - The lime softening process (LSP) is very sensitive to chemical reactions. In addition, optimal monitoring not only minimizes the sludge that must be disposed of but also reduces the operating costs of water conditioning. The weakness of the current (regular) control of LSP based on chemical analysis has been demonstrated experimentally and compared with the eco-efficient puffing method. Findings - This paper demonstrates that there is a good opportunity for many industries to save water and decrease waste water in the softening process by substituting the traditional method with the puffing method, a patented eco-efficient technology. Originality/value - Details of the innovative work required to minimize industrial water usage and waste production are outlined in this paper. Employing the novel puffing method for monitoring of the lime softening process saves a considerable amount of water while reducing chemical sludge.

  18. Privacy-preserving data cube for electronic medical records: An experimental evaluation.

    PubMed

    Kim, Soohyung; Lee, Hyukki; Chung, Yon Dohn

    2017-01-01

    The aim of this study is to evaluate the effectiveness and efficiency of privacy-preserving data cubes of electronic medical records (EMRs). An EMR data cube is a complex of EMR statistics that are summarized or aggregated by all possible combinations of attributes. Data cubes are widely utilized for efficient big data analysis and also have great potential for EMR analysis. For safe data analysis without privacy breaches, we must consider the privacy preservation characteristics of the EMR data cube. In this paper, we introduce a design for a privacy-preserving EMR data cube and the anonymization methods needed to achieve data privacy. We further focus on changes in efficiency and effectiveness that are caused by the anonymization process for privacy preservation. Thus, we experimentally evaluate various types of privacy-preserving EMR data cubes using several practical metrics and discuss the applicability of each anonymization method with consideration for the EMR analysis environment. We construct privacy-preserving EMR data cubes from anonymized EMR datasets. A real EMR dataset and demographic dataset are used for the evaluation. There are a large number of anonymization methods to preserve EMR privacy, and the methods are classified into three categories (i.e., global generalization, local generalization, and bucketization) by anonymization rules. According to this classification, three types of privacy-preserving EMR data cubes were constructed for the evaluation. We perform a comparative analysis by measuring the data size, cell overlap, and information loss of the EMR data cubes. Global generalization considerably reduced the size of the EMR data cube and did not cause the data cube cells to overlap, but incurred a large amount of information loss. Local generalization maintained the data size and generated only moderate information loss, but there were cell overlaps that could decrease the search performance. Bucketization did not cause cells to overlap and generated little information loss; however, the method considerably inflated the size of the EMR data cubes. The utility of anonymized EMR data cubes varies widely according to the anonymization method, and the applicability of the anonymization method depends on the features of the EMR analysis environment. The findings help to adopt the optimal anonymization method considering the EMR analysis environment and goal of the EMR analysis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. Optimization of diffusion-weighted single-refocused spin-echo EPI by reducing eddy-current artifacts and shortening the echo time.

    PubMed

    Shrestha, Manoj; Hok, Pavel; Nöth, Ulrike; Lienerth, Bianca; Deichmann, Ralf

    2018-03-30

    The purpose of this work was to optimize the acquisition of diffusion-weighted (DW) single-refocused spin-echo (srSE) data without intrinsic eddy-current compensation (ECC) for an improved performance of ECC postprocessing. The rationale is that srSE sequences without ECC may yield shorter echo times (TE) and thus higher signal-to-noise ratios (SNR) than srSE or twice-refocused spin-echo (trSE) schemes with intrinsic ECC. The proposed method employs dummy scans with DW gradients to drive eddy currents into a steady state before data acquisition. Parameters of the ECC postprocessing algorithm were also optimized. Simulations were performed to obtain minimum TE values for the proposed sequence and sequences with intrinsic ECC. Experimentally, the proposed method was compared with standard DW-trSE imaging, both in vitro and in vivo. Simulations showed substantially shorter TE for the proposed method than for methods with intrinsic ECC when using shortened echo readouts. Data of the proposed method showed a marked increase in SNR. A dummy scan duration of at least 1.5 s improved performance of the ECC postprocessing algorithm. Changes proposed for the DW-srSE sequence and for the parameter setting of the postprocessing ECC algorithm considerably reduced eddy-current artifacts and provided a higher SNR.

  20. Shiga Toxin–Producing Escherichia coli O157, England and Wales, 1983–2012

    PubMed Central

    Byrne, Lisa; Smith, Geraldine A.; Elson, Richard; Harris, John P.; Salmon, Roland; Smith, Robert; O’Brien, Sarah J.; Adak, Goutam K.; Jenkins, Claire

    2016-01-01

    We evaluated clinical Shiga toxin–producing Escherichia coli O157 infections in England and Wales during 1983–2012 to describe changes in microbiological and surveillance methods. A strain replacement event was captured; phage type (PT) 2 decreased to account for just 3% of cases by 2012, whereas PT8 and PT21/28 strains concurrently emerged, constituting almost two thirds of cases by 2012. Despite interventions to control and reduce transmission, incidence remained constant. However, sources of infection changed over time; outbreaks caused by contaminated meat and milk declined, suggesting that interventions aimed at reducing meat cross-contamination were effective. Petting farm and school and nursery outbreaks increased, suggesting the emergence of other modes of transmission and potentially contributing to the sustained incidence over time. Studies assessing interventions and consideration of policies and guidance should be undertaken to reduce Shiga toxin–producing E. coli O157 infections in England and Wales in line with the latest epidemiologic findings. PMID:26982243

  1. What Reasons Might the Other One Have?—Perspective Taking to Reduce Psychological Reactance in Individualists and Collectivists

    PubMed Central

    Steindl, Christina; Jonas, Eva

    2013-01-01

    Previous research has demonstrated a considerable amount of negative consequences resulting from psychological reactance. The purpose of this study was to explore opportunities to reduce the amount of reactance. Using the method of perspective taking as an intervention, the current study of 196 Austrians and 198 Filipinos examined whether reactance could be reduced and whether individualists and collectivists differ concerning reactance and their perspective taking abilities. Our results indicated that participants who took the perspective of the person who threatened them experienced less reactance than participants who did not take this approach. This was the case for people from both cultural backgrounds. Nevertheless, comparisons among the two cultural groups yielded different reactions to restrictions. This indicates that individualists are more sensitive to a self-experienced restriction than collectivists, but less sensitive to a restriction of another person. Consequently, we consider culture to be a crucial determinant in predicting the amount of reactance. PMID:23814682

  2. PROCEDURES FOR CALCULATING CESSATION LAG

    EPA Science Inventory

    Environmental regulations aimed at reducing cancer risks usually have the effect of reducing exposure to a carcinogen at the time the regulation is implemented. The reduction of cancer risk may occur shortly after the reduced exposure or after a considerable period of time. The t...

  3. Petroleum storage tank cleaning using commercial microbial culture products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, D.R.; Entzeroth, L.C.; Timmis, A.

    1995-12-31

    The removal of paraffinic bottom accumulations from refinery storage tanks represents an increasingly costly area of petroleum storage management. Microorganisms can be used to reduce paraffinic bottoms by increasing the solubility of bottom material and by increasing the wax-carrying capacity of carrier oil used in the cleaning process. The economic savings of such treatments are considerable. The process is also intrinsically safer than alternative methods, as it reduces and even eliminates the need for personnel to enter the tank during the cleaning process. Both laboratory and field sample analyses can be used to document changes in tank material during the treatment process. These changes include increases in volatile content and changes in wax distribution. Several case histories illustrating these physical and chemical changes are presented along with the economics of treatment.

  4. Ageing doctors.

    PubMed

    Lillis, Steven; Milligan, Eleanor

    2017-03-01

    Doctors are neither more nor less susceptible than the general population to the effects of ageing. The relevance of deterioration with age depends on the nature of the work undertaken. Reduced muscle strength and visual and auditory deterioration can compromise clinical ability. Accumulation of chronic disease further reduces capacity. Cognitive decline is of particular importance, as good medical care requires considerable cognitive function. Patient safety is paramount, yet older doctors are an important part of the medical workforce and their value should be recognised. Changes in patient case mix, work place support systems and individual adjustments can assist safe practice. Deterioration in health should be acknowledged and requires proactive management. Current methods of ensuring competence are inadequate for supporting ageing doctors. A new initiative is recommended comprising collaboration between regulators, colleges and employing institutions to support the ageing doctor in providing safe and effective practice. © 2017 AJA Inc.

  5. Compatibility of Segments of Thermoelectric Generators

    NASA Technical Reports Server (NTRS)

    Snyder, G. Jeffrey; Ursell, Tristan

    2009-01-01

    A method of calculating (usually for the purpose of maximizing) the power-conversion efficiency of a segmented thermoelectric generator is based on equations derived from the fundamental equations of thermoelectricity. Because it is directly traceable to first principles, the method provides physical explanations in addition to predictions of phenomena involved in segmentation. In comparison with the finite-element method used heretofore to predict (without being able to explain) the behavior of a segmented thermoelectric generator, this method is much simpler to implement in practice: in particular, the efficiency of a segmented thermoelectric generator can be estimated with this method by evaluating equations using only a hand-held calculator. In addition, the method provides for determination of cascading ratios. The concept of cascading is illustrated in the figure and the cascading ratio is defined in the figure caption. An important aspect of the method is its approach to the issue of compatibility among segments, in combination with introduction of the concept of compatibility within a segment. Prior approaches involved the use of only averaged material properties. Two materials in direct contact could be examined for compatibility with each other, but there was no general framework for analysis of compatibility. The present method establishes such a framework. The mathematical derivation of the method begins with the definition of reduced efficiency of a thermoelectric generator as the ratio between (1) its thermal-to-electric power-conversion efficiency and (2) its Carnot efficiency (the maximum efficiency theoretically attainable, given its hot- and cold-side temperatures). The derivation involves calculation of the reduced efficiency of a model thermoelectric generator for which the hot-side temperature is only infinitesimally greater than the cold-side temperature. The derivation includes consideration of the ratio (u) between the electric current and heat-conduction power and leads to the concept of compatibility factor (s) for a given thermoelectric material, defined as the value of u that maximizes the reduced efficiency of the aforementioned model thermoelectric generator.
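
    For orientation, the compatibility factor mentioned above is commonly quoted in closed form in the thermoelectric literature as s = (sqrt(1 + zT) - 1)/(S*T), with zT the dimensionless figure of merit. The sketch below evaluates it for illustrative material properties; the numerical values are assumptions for demonstration, not data from the article.

    ```python
    import math

    def compatibility_factor(S, rho, kappa, T):
        """Compatibility factor s = (sqrt(1 + zT) - 1) / (S*T).

        S     : Seebeck coefficient [V/K]
        rho   : electrical resistivity [ohm*m]
        kappa : thermal conductivity [W/(m*K)]
        T     : absolute temperature [K]
        """
        zT = S**2 * T / (rho * kappa)     # dimensionless figure of merit
        return (math.sqrt(1.0 + zT) - 1.0) / (S * T)

    # Assumed properties loosely typical of a good thermoelectric material
    s = compatibility_factor(S=200e-6, rho=1e-5, kappa=1.5, T=600)
    print(f"compatibility factor s = {s:.2f} 1/V")
    ```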

  6. On the vibrational characteristics of single- and double-walled carbon nanotubes containing ice nanotube in aqueous environment

    NASA Astrophysics Data System (ADS)

    Ansari, R.; Ajori, S.; Ameri, A.

    2015-10-01

    The properties and behavior of carbon nanotubes (CNTs) in aqueous environment due to their considerable potential applications in nanobiotechnology and designing nanobiosensors have attracted the attention of researchers. In this study, molecular dynamics simulations are carried out to investigate the vibrational characteristics of single- and double-walled CNTs containing ice nanotubes (a new phase of ice) in vacuum and aqueous environments. The results demonstrate that formation of ice nanotubes inside the CNTs reduces the natural frequency of pure CNTs. Moreover, it is demonstrated that increasing the number of walls considerably reduces the sensitivity of frequency to the presence of ice nanotube inside CNT. Additionally, it is shown that increasing the length decreases the effect of ice nanotube on reducing the frequency. The calculation of natural frequency of CNTs in aqueous media demonstrates that the interaction of CNTs with water molecules considerably reduces the natural frequency up to 50 %. Finally, it is demonstrated that in the case of CNTs with one free end in aqueous environment, the CNT does not vibrate in its first mode, and its frequency is between the frequencies of first and second modes of vibration.

  7. Passive Samplers for Investigations of Air Quality: Method Description, Implementation, and Comparison to Alternative Sampling Methods

    EPA Science Inventory

    This Paper covers the basics of passive sampler design, compares passive samplers to conventional methods of air sampling, and discusses considerations when implementing a passive sampling program. The Paper also discusses field sampling and sample analysis considerations to ensu...

  8. Simple glucose reduction route for one-step synthesis of copper nanofluids

    NASA Astrophysics Data System (ADS)

    Shenoy, U. Sandhya; Shetty, A. Nityananda

    2014-01-01

    A one-step method has been employed in the synthesis of copper nanofluids. Copper nitrate is reduced by glucose in the presence of sodium lauryl sulfate. The synthesized particles are characterized by X-ray diffraction for the phase structure; electron diffraction X-ray analysis for chemical composition; transmission electron microscopy and field emission scanning electron microscopy for the morphology; and Fourier-transform infrared spectroscopy and ultraviolet-visible spectroscopy for the analysis of the ingredients of the solution. Thermal conductivity, sedimentation and rheological measurements have also been carried out. It is found that the reaction parameters have a considerable effect on the size of the particles formed and the rate of the reaction. The techniques confirm that the synthesized particles are copper. The reported method showed a promising increase in the thermal conductivity of the base fluid and is found to be a reliable, simple and cost-effective method for preparing heat transfer fluids with higher stability.

  9. Singular perturbations and time scales in the design of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Naidu, Desineni S.; Price, Douglas B.

    1988-01-01

    The results of applying the methodology of Singular Perturbations and Time Scales (SPATS) to the control of digital flight systems are presented. A block diagonalization method is described to decouple a full order, two-time-scale (slow and fast) discrete control system into reduced order slow and fast subsystems. Basic properties and numerical aspects of the method are discussed. A composite, closed-loop, suboptimal control system is constructed as the sum of the slow and fast optimal feedback controls. The application of this technique to an aircraft model shows close agreement between the exact solutions and the decoupled (or composite) solutions. The main advantage of the method is the considerable reduction in the overall computational requirements for the evaluation of optimal guidance and control laws. The significance of the results is that they can be used for real time, onboard simulation. A brief survey is also presented of digital flight systems.

  10. Time Domain Estimation of Arterial Parameters using the Windkessel Model and the Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Gostuski, Vladimir; Pastore, Ignacio; Rodriguez Palacios, Gaspar; Vaca Diez, Gustavo; Moscoso-Vasquez, H. Marcela; Risk, Marcelo

    2016-04-01

    Numerous parameter estimation techniques exist for characterizing the arterial system using electrical circuit analogs. However, they are often limited by their requirements and their usually high computational burden. Therefore, a new method for estimating arterial parameters based on Monte Carlo simulation is proposed. A three-element Windkessel model was used to represent the arterial system. The approach was to reduce the error between the calculated and physiological aortic pressure by randomly generating arterial parameter values, while keeping the arterial resistance constant. This last value was obtained for each subject using the arterial flow, and was a necessary consideration in order to obtain a unique set of values for the arterial compliance and peripheral resistance. The estimation technique was applied to in vivo data containing steady beats in mongrel dogs, and it reliably estimated Windkessel arterial parameters. Further, this method appears to be computationally efficient for on-line time-domain estimation of these parameters.
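
    A minimal sketch of the approach described above, assuming a three-element Windkessel (characteristic impedance Zc in series with a parallel compliance C and peripheral resistance Rp), forward-Euler integration, and uniform random sampling of the free parameters with the total arterial resistance held fixed. The parameter ranges, units, and variable names are illustrative assumptions, not the authors' settings.

    ```python
    import numpy as np

    def wk3_pressure(q, dt, Zc, Rp, C, p0=80.0):
        """Aortic pressure from a 3-element Windkessel, integrated by forward Euler."""
        pc = np.empty_like(q)
        pc[0] = p0
        for i in range(len(q) - 1):
            dpc = q[i] / C - pc[i] / (Rp * C)     # compliance-chamber pressure dynamics
            pc[i + 1] = pc[i] + dt * dpc
        return Zc * q + pc

    def monte_carlo_fit(q, p_meas, dt, n_draws=20000, seed=0):
        """Randomly sample (Zc, C), keep total resistance fixed, retain the best fit."""
        rng = np.random.default_rng(seed)
        R_total = p_meas.mean() / q.mean()        # fixed, from measured flow and pressure
        best_err, best_params = np.inf, None
        for _ in range(n_draws):
            Zc = rng.uniform(0.02, 0.2) * R_total     # assumed search range
            C = rng.uniform(0.1, 3.0)                 # assumed search range [mL/mmHg]
            p_sim = wk3_pressure(q, dt, Zc, R_total - Zc, C, p0=p_meas[0])
            err = np.mean((p_sim - p_meas) ** 2)
            if err < best_err:
                best_err, best_params = err, (Zc, R_total - Zc, C)
        return best_err, best_params
    ```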

  11. Acquisition of Robotic Giant-swing Motion Using Reinforcement Learning and Its Consideration of Motion Forms

    NASA Astrophysics Data System (ADS)

    Sakai, Naoki; Kawabe, Naoto; Hara, Masayuki; Toyoda, Nozomi; Yabuta, Tetsuro

    This paper argues how a compact humanoid robot can acquire a giant-swing motion without any robotic models by using the Q-Learning method. Generally, it is widely said that Q-Learning is not appropriate for learning dynamic motions because the Markov property is not necessarily guaranteed during the dynamic task. However, we tried to solve this problem by embedding the angular velocity into the state definition and using an averaged Q-Learning method to reduce dynamic effects, although non-Markov effects remain in the learning results. The result shows how the robot can acquire a giant-swing motion by using the Q-Learning algorithm. The successfully acquired motions are analyzed from the viewpoint of dynamics in order to realize a functional giant-swing motion. Finally, the result shows how this method can avoid the stagnant action loop around the bottom of the horizontal bar during the early stage of the giant-swing motion.
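
    As a hedged, minimal illustration of the idea of folding angular velocity into the state for tabular Q-Learning (a sketch, not the authors' controller), the snippet below assumes discretized angle and angular-velocity bins and a small discrete action set; all bin counts, learning rates, and names are assumptions.

    ```python
    import numpy as np

    def q_learning_step(Q, state, action, reward, next_state, alpha=0.1, gamma=0.95):
        """One tabular update: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
        td_target = reward + gamma * np.max(Q[next_state])
        Q[state + (action,)] += alpha * (td_target - Q[state + (action,)])
        return Q

    def choose_action(Q, state, eps, rng, n_actions):
        """Epsilon-greedy action selection over the discrete action set."""
        if rng.random() < eps:
            return int(rng.integers(n_actions))
        return int(np.argmax(Q[state]))

    # Assumed discretization: (angle bin, angular-velocity bin) forms the state
    n_angle, n_omega, n_actions = 24, 12, 3
    Q = np.zeros((n_angle, n_omega, n_actions))
    rng = np.random.default_rng(0)
    state = (5, 3)
    action = choose_action(Q, state, eps=0.1, rng=rng, n_actions=n_actions)
    Q = q_learning_step(Q, state, action, reward=-1.0, next_state=(6, 4))
    ```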

  12. Estimation of Lightning Levels on a Launcher Using a BEM-Compressed Model

    NASA Astrophysics Data System (ADS)

    Silly, J.; Chaigne, B.; Aspas-Puertolas, J.; Herlem, Y.

    2016-05-01

    As development cycles in the space industry are being considerably reduced, it seems mandatory to deploy in parallel fast analysis methods for engineering purposes, but without sacrificing accuracy. In this paper we present the application of such methods to early Phase A-B [1] evaluation of lightning constraints on a launch vehicle. A complete 3D parametric model of a launcher has thus been developed and simulated with a Boundary Element Method (BEM) frequency simulator (equipped with a low frequency algorithm). The time domain values of the observed currents and fields are obtained by post-treatment using an inverse discrete Fourier transform (IDFT). This model is used for lightning studies; in particular, the simulations are useful for analysing the influence of lightning injected currents on the resulting currents circulating on external cable raceways. The description of the model and some of those results are presented in this article.

  13. Invariant and partially-invariant solutions of the equations describing a non-stationary and isentropic flow for an ideal and compressible fluid in (3 + 1) dimensions

    NASA Astrophysics Data System (ADS)

    Grundland, A. M.; Lalague, L.

    1996-04-01

    This paper presents a new method of constructing certain classes of solutions of a system of partial differential equations (PDEs) describing the non-stationary and isentropic flow of an ideal compressible fluid. A generalization of the symmetry reduction method to the case of partially-invariant solutions (PISs) has been formulated. We present a new algorithm for constructing PISs and discuss in detail the necessary conditions for the existence of non-reducible PISs. All these solutions share the same defect structure and are computed from four-dimensional symmetric subalgebras. These theoretical considerations are illustrated by several examples. Finally, some new classes of invariant solutions obtained by the symmetry reduction method are included. These solutions represent central, conical, rational, spherical, cylindrical and non-scattering double waves.

  14. Varicose vein therapy and nerve lesions.

    PubMed

    Hirsch, Tobias

    2017-03-01

    Treating varicose veins using endovenous thermal techniques - especially laser and radio frequency ablation - has emerged as an effective alternative to open surgery with stripping and high ligation. Even though these methods are very gentle and patient-friendly, they are nevertheless accompanied by risks and side effects. Compared to open surgical therapy, the risk of damage to peripheral and motor nerves is reduced; however, it still exists as a result of heat exposure and tumescent anaesthesia. Non-thermal methods that can be applied without tumescent anaesthesia have been introduced to the market. They pose a considerably lower risk of nerve lesions while proving to be much more effective. This paper investigates data on postoperative nerve damage and paraesthesia using internet research (PubMed). It analyses the current state of knowledge regarding non-thermal treatment methods and takes into account the latest developments in the use of cyanoacrylate to close insufficient saphenous veins.

  15. Open Rotor Noise Prediction Methods at NASA Langley- A Technology Review

    NASA Technical Reports Server (NTRS)

    Farassat, F.; Dunn, Mark H.; Tinetti, Ana F.; Nark, Douglas M.

    2009-01-01

    Open rotors are once again under consideration for propulsion of the future airliners because of their high efficiency. The noise generated by these propulsion systems must meet the stringent noise standards of today to reduce community impact. In this paper we review the open rotor noise prediction methods available at NASA Langley. We discuss three codes called ASSPIN (Advanced Subsonic-Supersonic Propeller Induced Noise), FW - Hpds (Ffowcs Williams-Hawkings with penetrable data surface) and the FSC (Fast Scattering Code). The first two codes are in the time domain and the third code is a frequency domain code. The capabilities of these codes and the input data requirements as well as the output data are presented. Plans for further improvements of these codes are discussed. In particular, a method based on equivalent sources is outlined to get rid of spurious signals in the FW - Hpds code.

  16. Application of digital terrain data to quantify and reduce the topographic effect on LANDSAT data

    NASA Technical Reports Server (NTRS)

    Justice, C. O.; Wharton, S. W.; Holben, B. N. (Principal Investigator)

    1980-01-01

    Integration of LANDSAT multispectral scanner (MSS) data with 30 m U.S. Geological Survey (USGS) digital terrain data was undertaken to quantify and reduce the topographic effect on imagery of a forested mountain ridge test site in central Pennsylvania. High Sun angle imagery revealed variation of as much as 21 pixel values in data for slopes of different angles and aspects with uniform surface cover. The large topographic effect apparent in MSS 4 and 5 was due to a combination of high absorption by the forest cover and the MSS quantization. Four methods for reducing the topographic effect were compared. Band ratioing of MSS 6/5 and MSS 7/5 did not eliminate the topographic effect because of the lack of variation in MSS 4 and 5 radiances. The three radiance models examined to reduce the topographic effect required integration of the digital terrain data. Two Lambertian models increased the variation in the LANDSAT radiances. The non-Lambertian model considerably reduced (by 86 per cent) the topographic effect in the LANDSAT data. The study demonstrates that high quality digital terrain data, as provided by the USGS digital elevation model data, can be used to enhance the utility of multispectral satellite data.
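
    One widely used non-Lambertian correction of the kind referenced above is the Minnaert model; the sketch below applies it per pixel, assuming the incidence and exitance (slope) angles have already been derived from the digital terrain data. The Minnaert constant k, the epsilon guard, and the exact formulation are illustrative assumptions and not necessarily those used in the study.

    ```python
    import numpy as np

    def minnaert_correction(radiance, cos_i, cos_e, k=0.7):
        """Non-Lambertian (Minnaert-type) topographic normalization.

        radiance : observed radiance per pixel
        cos_i    : cosine of the solar incidence angle on the sloped surface
        cos_e    : cosine of the exitance (slope) angle
        k        : Minnaert constant (assumed; typically fitted per band by regression)
        """
        return radiance * cos_e / (cos_i ** k * cos_e ** k)

    def band_ratio(band_a, band_b, eps=1e-6):
        """Simple band ratio, e.g. MSS 7 / MSS 5, as a first-order slope/aspect suppressor."""
        return band_a / (band_b + eps)
    ```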

  17. Validating the operational bias and hypothesis of universal exponent in landslide frequency-area distribution.

    PubMed

    Huang, Jr-Chuan; Lee, Tsung-Yu; Teng, Tse-Yang; Chen, Yi-Chin; Huang, Cho-Ying; Lee, Cheing-Tung

    2014-01-01

    The exponent decay in the landslide frequency-area distribution is widely used for assessing the consequences of landslides, with some studies arguing that the slope of the exponent decay is universal and independent of mechanisms and environmental settings. However, the documented exponent slopes are diverse, and hence data processing is hypothesized to be the cause of this inconsistency. An elaborate statistical experiment and two actual landslide inventories were used here to demonstrate the influence of data processing on the determination of the exponent. Seven categories with different landslide numbers were generated from a predefined inverse-gamma distribution and then analyzed by three data processing procedures (logarithmic binning, LB; normalized logarithmic binning, NLB; and cumulative distribution function, CDF). Five different bin widths were also considered while applying LB and NLB. Following that, maximum likelihood estimation was used to estimate the exponent slopes. The results showed that the exponents estimated by CDF were unbiased while LB and NLB performed poorly. The two binning-based methods led to considerable biases that increased with increasing landslide number and bin width. The standard deviations of the estimated exponents were dependent not just on the landslide number but also on the binning method and bin width. Both extremely few and extremely plentiful landslide numbers reduced the confidence of the estimated exponents, which could be attributed to limited landslide numbers and considerable operational bias, respectively. The diverse documented exponents in the literature should therefore be adjusted accordingly. Our study strongly suggests that the considerable bias due to data processing and the data quality should be constrained in order to advance the understanding of landslide processes.
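
    The contrast drawn above between binning-based fits and the maximum-likelihood route can be illustrated with the standard continuous power-law MLE for a tail exponent. This is a sketch under the simplifying assumption of a pure power law above a cutoff area (simpler than the inverse-gamma model used in the paper); the cutoff, sample size, and true exponent are assumed values for the self-test.

    ```python
    import numpy as np

    def powerlaw_mle(areas, a_min):
        """MLE of beta for p(a) ~ a^(-beta), a >= a_min: beta = 1 + n / sum(ln(a/a_min))."""
        tail = np.asarray(areas, dtype=float)
        tail = tail[tail >= a_min]
        n = tail.size
        beta_hat = 1.0 + n / np.sum(np.log(tail / a_min))
        stderr = (beta_hat - 1.0) / np.sqrt(n)     # asymptotic standard error
        return beta_hat, stderr

    # Synthetic check: sample a known power law (beta = 2.4) by inverse-transform sampling
    rng = np.random.default_rng(1)
    beta_true, a_min = 2.4, 1e3
    u = rng.random(5000)
    areas = a_min * (1.0 - u) ** (-1.0 / (beta_true - 1.0))
    print(powerlaw_mle(areas, a_min))   # expected to be close to (2.4, ~0.02)
    ```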

  18. High-speed architecture for the decoding of trellis-coded modulation

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1992-01-01

    Since 1971, when the Viterbi Algorithm was introduced as the optimal method of decoding convolutional codes, improvements in circuit technology, especially VLSI, have steadily increased its speed and practicality. Trellis-Coded Modulation (TCM) combines convolutional coding with higher level modulation (non-binary source alphabet) to provide forward error correction and spectral efficiency. For binary codes, the current state-of-the-art is a 64-state Viterbi decoder on a single CMOS chip, operating at a data rate of 25 Mbps. Recently, there has been an interest in increasing the speed of the Viterbi Algorithm by improving the decoder architecture, or by reducing the algorithm itself. Designs employing new architectural techniques are now in existence; however, these techniques are currently applied to simpler binary codes, not to TCM. The purpose of this report is to discuss TCM architectural considerations in general, and to present the design, at the logic gate level, of a specific TCM decoder which applies these considerations to achieve high-speed decoding.

  19. Phosphate interference during in situ treatment for arsenic in groundwater.

    PubMed

    Brunsting, Joseph H; McBean, Edward A

    2014-01-01

    Contamination of groundwater by arsenic is a problem in many areas of the world, particularly in West Bengal (India) and Bangladesh, where reducing conditions in groundwater are the cause. In situ treatment is a novel approach wherein, by introduction of dissolved oxygen (DO), advantages over other treatment methods can be achieved through simplicity, not using chemicals, and not requiring disposal of arsenic-rich wastes. A lab-scale test of in situ treatment by air sparging, using a solution with approximately 5.3 mg L(-1) ferrous iron and 200 μg L(-1) arsenate, showed removal of arsenate in the range of 59%. A significant obstacle exists, however, due to the interference of phosphate since phosphate competes for adsorption sites on oxidized iron precipitates. A lab-scale test including 0.5 mg L(-1) phosphate showed negligible removal of arsenate. In situ treatment by air sparging demonstrates considerable promise for removal of arsenic from groundwater where iron is present in considerable quantities and phosphates are low.

  20. Mission to Mars using integrated propulsion concepts: considerations, opportunities, and strategies.

    PubMed

    Accettura, Antonio G; Bruno, Claudio; Casotto, Stefano; Marzari, Francesco

    2004-04-01

    The aim of this paper is to evaluate the feasibility of a mission to Mars using Integrated Propulsion Systems (IPS), that is, coupling Nuclear, MPD and ISPU propulsion systems. In particular, both mission analysis and propulsion aspects are analyzed together with technological aspects. Identifying possible mission scenarios will lead to the study of possible strategies for Mars exploration and also of methods for reducing cost. As regards the IPS, the coupling between Nuclear Propulsion (Rubbia's engine) and Superconductive MPD propulsion is considered for the Earth-Mars trajectories; major emphasis is given to the advantages of such a system. In Situ Resource Utilization (ISRU) concerns on-Mars operations; In Situ Propellant Utilization (ISPU) is foreseen particularly for LOX-CH4 engines for Mars Ascent Vehicles, and this possibility is analyzed from a technological point of view. Tether systems are also considered, during interplanetary trajectories and as space elevators in Mars orbit. Finally, strategic considerations associated with this mission are also addressed. © 2003 Elsevier Ltd. All rights reserved.

  1. Biological Contributions to Addictions in Adolescents and Adults: Prevention, Treatment and Policy Implications

    PubMed Central

    Potenza, Marc N.

    2012-01-01

    Purpose Despite significant advances in our understanding of the biological bases of addictions, these disorders continue to represent a huge public health burden that is associated with substantial personal suffering. Efforts to target addictions require consideration of how the improved biological understanding of addictions may lead to improved prevention, treatment and policy initiatives. Method In this article, we provide a narrative review of current biological models for addictions with a goal of placing existing data and theories within a translational and developmental framework targeting the advancement of prevention, treatment and policy strategies. Results Data regarding individual differences, intermediary phenotypes, and main and interactive influences of genetic and environmental contributions in the setting of developmental trajectories that may be influenced by addictive drugs or behavior indicate complex underpinnings of addictions. Conclusions Consideration and further elucidation of the biological etiologies of addictions hold significant potential for making important gains and reducing the public health impact of addictions. PMID:23332567

  2. Ozone Air Quality over North America: Part II-An Analysis of Trend Detection and Attribution Techniques.

    PubMed

    Porter, P Steven; Rao, S Trivikrama; Zurbenko, Igor G; Dunker, Alan M; Wolff, George T

    2001-02-01

    Assessment of regulatory programs aimed at improving ambient O3 air quality is of considerable interest to the scientific community and to policymakers. Trend detection, the identification of statistically significant long-term changes, and attribution, linking change to specific climatological and anthropogenic forcings, are instrumental to this assessment. Detection and attribution are difficult because changes in pollutant concentrations of interest to policymakers may be much smaller than natural variations due to weather and climate. In addition, there are considerable differences in reported trends seemingly based on similar statistical methods and databases. Differences arise from the variety of techniques used to reduce nontrend variation in time series, including mitigating the effects of meteorology and the variety of metrics used to track changes. In this paper, we review the trend assessment techniques being used in the air pollution field and discuss their strengths and limitations in discerning and attributing changes in O3 to emission control policies.
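
    One technique widely used in this literature to strip short-term meteorological variation from O3 series before trend estimation is the Kolmogorov-Zurbenko (KZ) filter, an iterated moving average. The sketch below is a hedged illustration; the window length and iteration count shown are commonly quoted choices for daily ozone data, not values taken from this paper.

    ```python
    import numpy as np

    def moving_average(x, window):
        """Centered moving average with NaN-aware, shrinking edges."""
        half = window // 2
        out = np.full(len(x), np.nan, dtype=float)
        for i in range(len(x)):
            lo, hi = max(0, i - half), min(len(x), i + half + 1)
            out[i] = np.nanmean(x[lo:hi])
        return out

    def kz_filter(x, window, iterations):
        """Kolmogorov-Zurbenko filter: apply the moving average repeatedly."""
        y = np.asarray(x, dtype=float)
        for _ in range(iterations):
            y = moving_average(y, window)
        return y

    # Typical usage: KZ(15, 5) on daily maxima isolates the baseline (seasonal plus
    # long-term) component; the residual holds the short-term, weather-driven part.
    # baseline = kz_filter(daily_o3, window=15, iterations=5)
    # short_term = daily_o3 - baseline
    ```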

  3. The safety of available and emerging options for emergency contraception.

    PubMed

    Lee, Jessica K; Schwarz, Eleanor Bimla

    2017-10-01

    Emergency contraception (EC) is a way to significantly reduce the chance of becoming pregnant after an episode of unprotected intercourse. Considerable data support the safety of all available and emerging options for EC. Areas covered: This review presents a comprehensive summary of the literature regarding the safety of EC as well as directions for further study. PubMed was searched for all relevant studies published prior to June 2017. Expert opinion: All available methods of EC (i.e., ulipristal acetate pills, levonorgestrel pills, and the copper-IUD), carry only mild side effects and serious adverse events are essentially unknown. The copper IUD has the highest efficacy of EC methods. Given the excellent safety profiles of mifepristone and the levonorgestrel IUD, research is ongoing related to use of these products for EC.

  4. Efficient shortcut techniques in evanescently coupled waveguides

    NASA Astrophysics Data System (ADS)

    Paul, Koushik; Sarma, Amarendra K.

    2016-10-01

    The Shortcut to Adiabatic Passage (SHAPE) technique, in the context of coherent control of atomic systems, has gained considerable attention in the last few years, primarily because of its ability to manipulate population among quantum states infinitely fast compared to adiabatic processes. Two methods in this regard have been explored rigorously, namely transitionless quantum driving and the Lewis-Riesenfeld invariant approach. We have applied these two methods to realize SHAPE in an adiabatic waveguide coupler. Waveguide couplers are integral components of photonic circuits, primarily used as switching devices. Our study shows that with appropriate engineering of the coupling coefficient and propagation constants of the coupler it is possible to achieve efficient and complete power switching. We also observed that the coupler length could be reduced significantly without affecting the coupling efficiency of the system.

  5. Mechanism, synthesis and modification of nano zerovalent iron in water treatment

    NASA Astrophysics Data System (ADS)

    Lu, Hai-Jiao; Wang, Jing-Kang; Ferguson, Steven; Wang, Ting; Bao, Ying; Hao, Hong-Xun

    2016-05-01

    Owing to its strong reducing ability, high reaction activity, excellent adsorption properties, good mobility and relatively low cost, nano zerovalent iron (nZVI) is an extremely promising nanomaterial for use in water treatment. In this paper, the working mechanisms of nZVI in the degradation of various contaminants in water are outlined and discussed. Synthesis methods and their respective advantages and disadvantages are discussed in detail. Furthermore, a variety of modification methods which have been developed to improve the mobility and stability of nZVI as well as to facilitate the separation of nZVI from degraded systems are also summarized and discussed. Numerous studies indicate that nZVI has considerable potential to become an efficient, versatile and practical approach for large-scale water treatment.

  6. Toward a Safer and Cleaner Way: Dealing With Human Waste in Healthcare.

    PubMed

    Apple, Michael

    2016-07-01

    Organizations must evaluate their infection control plans in a holistic and inclusive manner to continue reducing healthcare-associated infection (HAI) rates, including giving consideration to the manner of collecting and disposing of patient waste. Manual washing of bedpans and other containers poses a risk of spreading infection via caregivers, the environment, and the still-contaminated bedpan. Several alternative disposal methods are available and have been tested in some countries for decades, including options such as bedpan washer-disinfector machines, macerator machines, and disposable bedpans. This article reviews methods and issues related to human waste disposal in healthcare settings. Healthcare organizations must evaluate the options thoroughly and then consistently implement the option most in line with its goals and culture. © The Author(s) 2016.

  7. Effect of ion implantation on the tribology of metal-on-metal hip prostheses.

    PubMed

    Bowsher, John G; Hussain, Azad; Williams, Paul; Nevelos, Jim; Shelton, Julia C

    2004-12-01

    Nitrogen ion implantation (which considerably hardens the surface of the bearing) may represent one possible method of reducing the wear of metal-on-metal (MOM) hip bearings. Currently there are no ion-implanted MOM bearings used clinically. Therefore a physiological hip simulator test was undertaken using standard test conditions, and the results compared to previous studies using the same methods. N2-ion implantation of high carbon cast Co-Cr-Mo-on-Co-Cr-Mo hip prostheses increased wear 2-fold during the aggressive running-in phase compared to untreated bearing surfaces, and showed no wear reduction during steady-state conditions. Although only two specimens were considered in the current study, it would appear that ion implantation has no clinical benefit for MOM.

  8. Behavior Therapy for Tourette Syndrome: A Systematic Review and Meta-analysis.

    PubMed

    Wile, Daryl J; Pringsheim, Tamara M

    2013-08-01

    When tics caused by Tourette Syndrome cause meaningful impairment for patients, a comprehensive treatment approach includes education of patients, peers, and family, treatment of comorbid behavioral disorders if present, and consideration of behavior therapy and pharmacotherapy for tics themselves. This systematic review and meta-analysis demonstrates that behavior therapies based on Habit Reversal Therapy, including the Comprehensive Behavioral Intervention for Tics are effective in reducing tic severity when compared with supportive psychotherapy. When these behavior therapies are unavailable, Exposure with Response Prevention may also be effective. Both face-to-face and telehealth delivery methods for behavior therapy improve tic severity, and broader distribution of behavior therapy through increased training or telehealth methods is encouraged. High-quality randomized trials comparing behavior therapies for tics with pharmacotherapy are needed.

  9. Alternative methods for C.R.A pipeline welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belloni, A.

    1996-12-01

    The application of the GTAW process for C.R.A. (Corrosion Resistant Alloy) linepipe welding is a well known practice; nevertheless, the high construction costs associated with the use of this process, particularly for root pass welding (low welding speed), necessitate consideration of an updated version of the GMAW process (higher welding speed) for the same application. The present paper describes the progress obtained in using alternative welding methods to GTAW cold wire, such as GTAW hot wire and GMAW, to increase the welding speed and consequently reduce the overall project cost. The authors feel that this approach is essential to increase the pipelaying productivity of C.R.A. linepipe materials, at present still far below that of carbon steel.

  10. Reassuring and reducing anxiety in seriously injured patients: a study of accident and emergency interventions.

    PubMed

    Jay, R

    1996-07-01

    This paper is an account of a qualitative study of significantly injured patients' experience in the Accident and Emergency department (A&E) and explores the question 'What A&E nursing interventions are effective in providing reassurance and reducing anxiety in patients with significant trauma?' A review of the limited literature available indicates that when a seriously injured patient is admitted to the A&E department, the medical and nursing staff often respond urgently to the physiological crisis without adequate consideration of psychological needs. The research design used a version of the Critical Incident Technique to interview 7 patients who had sustained serious injuries a few days previously. The findings indicate that central to the delivery of emergency care is the individual's transition from their normal independent existence through pre-hospital trauma and into the isolating experience of fear, dependence and the resuscitation room. Different methods of coping were required to meet their needs and regain some control. Methods such as touch, company and information became paramount as did the need to trust the people seen to be in control of their new environment.

  11. Evaluation of Aesthetic Function and Thermal Modification of Vertical Greenery at Bogor City, Indonesia

    NASA Astrophysics Data System (ADS)

    Sulistyantara, B.; Sesara, R.

    2017-10-01

    Bogor city is currently developing vertical greenery to counter the decrease in green space. Vertical greenery is a planting method using a vertical structure similar to a retaining wall. It provides several benefits, such as adding aesthetic value to the landscape, protecting from heat, reducing noise, and reducing pollution. The purposes of this study were to identify thermal modification by vertical greenery in Bogor city, to assess the aesthetic value of vertical greenery, and to provide recommendations for managing and improving the quality of vertical greenery in Bogor city. The study was conducted using the Scenic Beauty Estimation method, with questionnaires given to respondents to assess the aesthetic value of vertical greenery. An infrared thermometer was also used to measure surface temperatures to evaluate the thermal modification function of the vertical greenery. The results showed that vertical greenery in Bogor city has considerably good aesthetic value. They also showed a decrease in the surface temperature of the vertical greenery structure.

  12. A model-free method for mass spectrometer response correction. [for oxygen consumption and cardiac output calculation

    NASA Technical Reports Server (NTRS)

    Shykoff, Barbara E.; Swanson, Harvey T.

    1987-01-01

    A new method for correction of mass spectrometer output signals is described. Response-time distortion is reduced independently of any model of mass spectrometer behavior. The delay of the system is found first from the cross-correlation function of a step change and its response. A two-sided time-domain digital correction filter (deconvolution filter) is generated next from the same step response data using a regression procedure. Other data are corrected using the filter and delay. The mean squared error between a step response and a step is reduced considerably more after the use of a deconvolution filter than after the application of a second-order model correction. O2 consumption and CO2 production values calculated from data corrupted by a simulated dynamic process return to near the uncorrupted values after correction. Although a clean step response or the ensemble average of several responses contaminated with noise is needed for the generation of the filter, random noise of magnitude not above 0.5 percent added to the response to be corrected does not impair the correction severely.
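
    A hedged sketch of the two steps described above: the delay is taken from the cross-correlation of a recorded step response with an ideal step, and a two-sided FIR correction filter is fitted by least-squares regression so that the filtered step response approximates the step. The filter length, ridge term, and variable names are assumptions; this is not the authors' exact algorithm.

    ```python
    import numpy as np

    def estimate_delay(step_response, ideal_step):
        """Delay (in samples) at the peak of the cross-correlation of the two signals."""
        xc = np.correlate(step_response - step_response.mean(),
                          ideal_step - ideal_step.mean(), mode="full")
        return int(np.argmax(xc)) - (len(ideal_step) - 1)

    def fit_deconvolution_filter(measured, desired, half_len=10, ridge=1e-6):
        """Fit a two-sided FIR filter h so that (measured convolved with h) ~ desired."""
        taps = 2 * half_len + 1
        n = len(measured)
        X = np.zeros((n, taps))                 # design matrix of shifted copies
        for j in range(taps):
            shift = j - half_len
            if shift >= 0:
                X[shift:, j] = measured[:n - shift]
            else:
                X[:n + shift, j] = measured[-shift:]
        h = np.linalg.solve(X.T @ X + ridge * np.eye(taps), X.T @ desired)
        return h

    def apply_filter(signal, h):
        """Apply the centred, two-sided correction filter (delay removed beforehand)."""
        return np.convolve(signal, h, mode="same")
    ```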

  13. Presequence-Independent Mitochondrial Import of DNA Ligase Facilitates Establishment of Cell Lines with Reduced mtDNA Copy Number

    PubMed Central

    Spadafora, Domenico; Kozhukhar, Natalia; Alexeyev, Mikhail F.

    2016-01-01

    Due to the essential role played by mitochondrial DNA (mtDNA) in cellular physiology and bioenergetics, methods for establishing cell lines with altered mtDNA content are of considerable interest. Here, we report evidence for the existence in mammalian cells of a novel, low-efficiency, presequence-independent pathway for mitochondrial protein import, which facilitates mitochondrial uptake of such proteins as Chlorella virus ligase (ChVlig) and Escherichia coli LigA. Mouse cells engineered to depend on this pathway for mitochondrial import of the LigA protein for mtDNA maintenance had severely (up to >90%) reduced mtDNA content. These observations were used to establish a method for the generation of mouse cell lines with reduced mtDNA copy number by, first, transducing them with a retrovirus encoding LigA, and then inactivating in these transductants endogenous Lig3 with CRISPR-Cas9. Interestingly, mtDNA depletion to an average level of one copy per cell proceeds faster in cells engineered to maintain mtDNA at low copy number. This makes a low-mtDNA copy number phenotype resulting from dependence on mitochondrial import of DNA ligase through the presequence-independent pathway potentially useful for rapidly shifting mtDNA heteroplasmy through partial mtDNA depletion. PMID:27031233

  14. Reducing variation in decomposition odour profiling using comprehensive two-dimensional gas chromatography.

    PubMed

    Perrault, Katelynn A; Stefanuto, Pierre-Hugues; Stuart, Barbara H; Rai, Tapan; Focant, Jean-François; Forbes, Shari L

    2015-01-01

    Challenges in decomposition odour profiling have led to variation in the documented odour profile by different research groups worldwide. Background subtraction and use of controls are important considerations given the variation introduced by decomposition studies conducted in different geographical environments. The collection of volatile organic compounds (VOCs) from soil beneath decomposing remains is challenging due to the high levels of inherent soil VOCs, further confounded by the use of highly sensitive instrumentation. This study presents a method that provides suitable chromatographic resolution for profiling decomposition odour in soil by comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry using appropriate controls and field blanks. Logarithmic transformation and t-testing of compounds permitted the generation of a compound list of decomposition VOCs in soil. Principal component analysis demonstrated the improved discrimination between experimental and control soil, verifying the value of the data handling method. Data handling procedures have not been well documented in this field and standardisation would thereby reduce misidentification of VOCs present in the surrounding environment as decomposition byproducts. Uniformity of data handling and instrumental procedures will reduce analytical variation, increasing confidence in the future when investigating the effect of taphonomic variables on the decomposition VOC profile. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
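
    A minimal sketch of the data handling described above (log transformation, per-compound t-testing, and PCA) is given below. Array layouts, the Welch t-test variant, and the +1 offset before the log are assumptions for illustration; the published workflow may differ in detail.

      import numpy as np
      from scipy import stats
      from sklearn.decomposition import PCA

      def decomposition_voc_screen(exp_areas, ctrl_areas, alpha=0.05):
          """Rows = samples, columns = compound peak areas (assumed layout).
          Returns indices of candidate decomposition VOCs and 2D PCA scores."""
          log_exp = np.log10(exp_areas + 1.0)    # +1 avoids taking log of zero areas
          log_ctrl = np.log10(ctrl_areas + 1.0)
          # Per-compound Welch t-test; keep compounds elevated in experimental soil
          t, p = stats.ttest_ind(log_exp, log_ctrl, axis=0, equal_var=False)
          decomposition_vocs = np.where((p < alpha) & (t > 0))[0]
          # PCA on all samples to visualise experimental vs control soil separation
          X = np.vstack([log_exp, log_ctrl])
          scores = PCA(n_components=2).fit_transform(X)
          return decomposition_vocs, scores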

  15. 23 CFR 650.709 - Special considerations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... candidates in States that have not previously been allocated discretionary bridge funds. In addition, consideration will be given to candidates that receive additional funds or contributions from local, State... share of the project. These funds or contributions may be used to reduce the total project cost for use...

  16. Can male vaccination reduce the burden of human papillomavirus-related disease in the United States?

    PubMed

    Low, Garren M I; Attiga, Yasser S; Garg, Gaurav; Schlegal, Richard; Gallicano, G Ian

    2012-06-01

    Human papillomavirus (HPV) can cause cervical cancer, as well as a number of other diseases in both men and women. Both sexes play a role in transmission of the disease, but the cost-effectiveness of HPV vaccination differs between them. It is necessary to determine the best allocation of limited resources between these two populations to produce the most effective strategy for reducing the burden from HPV-related disease. This literature review intends to elucidate the economic and social considerations that will lead to maximum utilization of vaccination programs, which in turn will reduce the burden of HPV-related disease. Current outreach in the United States is based on vaccination against HPV as a means for combating cervical cancer in women. If we are to include males, however, new marketing strategies must focus on educating patients about the full range of the vaccine's benefits. Men who have sex with men (MSM) are also unprotected against HPV in the current system. Social considerations alone may not be enough, however, as economic prediction models suggest that the associated costs outweigh the benefits in most circumstances. Taking this into account, our review also considers alternate methods of maximizing prevention of HPV-associated disease. The most prudent programs will include physician involvement in patient education and the implementation of structured vaccination and screening programs. Unfortunately, many countries do not have the necessary resources to undertake national vaccination programs. HPV testing and cytology screening for women and MSM may be the most financially reasonable option for many countries.

  17. Background field removal technique based on non-regularized variable kernels sophisticated harmonic artifact reduction for phase data for quantitative susceptibility mapping.

    PubMed

    Kan, Hirohito; Arai, Nobuyuki; Takizawa, Masahiro; Omori, Kazuyoshi; Kasai, Harumasa; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2018-06-11

    We developed a non-regularized, variable kernel, sophisticated harmonic artifact reduction for phase data (NR-VSHARP) method to accurately estimate local tissue fields without regularization for quantitative susceptibility mapping (QSM). We then used a digital brain phantom to evaluate the accuracy of the NR-VSHARP method, and compared it with the VSHARP and iterative spherical mean value (iSMV) methods through in vivo human brain experiments. Our proposed NR-VSHARP method, which uses variable spherical mean value (SMV) kernels, minimizes L2 norms only within the volume of interest to reduce phase errors and save cortical information without regularization. In a numerical phantom study, relative local field and susceptibility map errors were determined using NR-VSHARP, VSHARP, and iSMV. Additionally, various background field elimination methods were used to image the human brain. In a numerical phantom study, the use of NR-VSHARP considerably reduced the relative local field and susceptibility map errors throughout a digital whole brain phantom, compared with VSHARP and iSMV. In the in vivo experiment, the NR-VSHARP-estimated local field could sufficiently achieve minimal boundary losses and phase error suppression throughout the brain. Moreover, the susceptibility map generated using NR-VSHARP minimized the occurrence of streaking artifacts caused by insufficient background field removal. Our proposed NR-VSHARP method yields minimal boundary losses and highly precise phase data. Our results suggest that this technique may facilitate high-quality QSM. Copyright © 2017. Published by Elsevier Inc.
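
    For orientation only, the sketch below illustrates the spherical mean value (SMV) operation that SHARP-type background removal builds on: a harmonic background field equals its spherical mean, so subtracting the spherical mean inside an eroded brain mask suppresses the background contribution. This is not the NR-VSHARP algorithm itself (it omits the variable kernels, the L2 minimization restricted to the volume of interest, and the deconvolution step); kernel radius and function names are assumptions.

      import numpy as np
      from scipy.ndimage import convolve, binary_erosion

      def sphere_kernel(radius):
          """Normalized spherical averaging kernel with the given voxel radius."""
          r = int(radius)
          zz, yy, xx = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1]
          k = (xx**2 + yy**2 + zz**2 <= radius**2).astype(float)
          return k / k.sum()

      def smv_local_field(total_field, mask, radius=5):
          """Crude local-field estimate: subtract the spherical mean of the field
          inside the mask and discard unreliable voxels near the mask boundary."""
          k = sphere_kernel(radius)
          smv = convolve(total_field * mask, k, mode="constant")
          norm = convolve(mask.astype(float), k, mode="constant")
          smv = np.divide(smv, norm, out=np.zeros_like(smv), where=norm > 0)
          eroded = binary_erosion(mask, iterations=radius)
          return (total_field - smv) * eroded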

  18. Projecting future air pollution-related mortality under a changing climate: progress, uncertainties and research needs.

    PubMed

    Madaniyazi, Lina; Guo, Yuming; Yu, Weiwei; Tong, Shilu

    2015-02-01

    Climate change may affect mortality associated with air pollutants, especially for fine particulate matter (PM2.5) and ozone (O3). Projection studies of such kind involve complicated modelling approaches with uncertainties. We conducted a systematic review of research and methods for projecting future PM2.5-/O3-related mortality to identify the uncertainties and optimal approaches for handling uncertainty. A literature search was conducted in October 2013, using the electronic databases: PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search was limited to peer-reviewed journal articles published in English from January 1980 to September 2013. Fifteen studies fulfilled the inclusion criteria. Most studies reported that an increase of climate change-induced PM2.5 and O3 may result in an increase in mortality. However, little research has been conducted in developing countries with high emissions and dense populations. Additionally, health effects induced by PM2.5 may dominate compared to those caused by O3, but projection studies of PM2.5-related mortality are fewer than those of O3-related mortality. There is considerable variation in the approaches used by scenario-based projection studies, which makes it difficult to compare results. Multiple scenarios, models and downscaling methods have been used to reduce uncertainties. However, few studies have discussed what the main source of uncertainty is and which uncertainty could be most effectively reduced. Projecting air pollution-related mortality requires a systematic consideration of assumptions and uncertainties, which will significantly aid policymakers in efforts to manage potential impacts of PM2.5 and O3 on mortality in the context of climate change. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  19. An anisotropic diffusion method for denoising dynamic susceptibility contrast-enhanced magnetic resonance images

    NASA Astrophysics Data System (ADS)

    Murase, Kenya; Yamazaki, Youichi; Shinohara, Masaaki; Kawakami, Kazunori; Kikuchi, Keiichi; Miki, Hitoshi; Mochizuki, Teruhito; Ikezoe, Junpei

    2001-10-01

    The purpose of this study was to present an application of a novel denoising technique for improving the accuracy of cerebral blood flow (CBF) images generated from dynamic susceptibility contrast-enhanced magnetic resonance imaging (DSC-MRI). The method presented in this study was based on anisotropic diffusion (AD). The usefulness of this method was firstly investigated using computer simulations. We applied this method to patient data acquired using a 1.5 T MR system. After a bolus injection of Gd-DTPA, we obtained 40-50 dynamic images with a 1.32-2.08 s time resolution in 4-6 slices. The dynamic images were processed using the AD method, and then the CBF images were generated using pixel-by-pixel deconvolution analysis. For comparison, the CBF images were also generated with or without processing the dynamic images using a median or Gaussian filter. In simulation studies, the standard deviation of the CBF values obtained after processing by the AD method was smaller than that of the CBF values obtained without any processing, while the mean value agreed well with the true CBF value. Although the median and Gaussian filters also reduced image noise, the mean CBF values were considerably underestimated compared with the true values. Clinical studies also suggested that the AD method was capable of reducing the image noise while preserving the quantitative accuracy of CBF images. In conclusion, the AD method appears useful for denoising DSC-MRI, which will make the CBF images generated from DSC-MRI more reliable.
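
    The study is based on anisotropic diffusion; a generic Perona-Malik scheme of that kind is sketched below for reference. The iteration count, conduction constant and exponential edge-stopping function are assumptions, not the parameters used in the paper.

      import numpy as np

      def anisotropic_diffusion(img, n_iter=20, kappa=30.0, lam=0.2):
          """Illustrative Perona-Malik anisotropic diffusion for a 2D image:
          smooths homogeneous regions while preserving edges."""
          u = img.astype(float).copy()
          for _ in range(n_iter):
              # Finite-difference gradients toward the four neighbours
              dn = np.roll(u, -1, axis=0) - u
              ds = np.roll(u, 1, axis=0) - u
              de = np.roll(u, -1, axis=1) - u
              dw = np.roll(u, 1, axis=1) - u
              # Edge-stopping conduction coefficients (exponential variant)
              cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
              ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
              u += lam * (cn * dn + cs * ds + ce * de + cw * dw)
          return u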

  20. Active learning based segmentation of Crohn's disease from abdominal MRI.

    PubMed

    Mahapatra, Dwarikanath; Vos, Franciscus M; Buhmann, Joachim M

    2016-05-01

    This paper proposes a novel active learning (AL) framework, and combines it with semi-supervised learning (SSL), for segmenting Crohn's disease (CD) tissues from abdominal magnetic resonance (MR) images. Robust fully supervised learning (FSL) based classifiers require large amounts of labeled data covering different disease severities. Obtaining such data is time consuming and requires considerable expertise. SSL methods use a few labeled samples, and leverage the information from many unlabeled samples to train an accurate classifier. AL queries labels of the most informative samples and maximizes gain from the labeling effort. Our primary contribution is in designing a query strategy that combines novel context information with classification uncertainty and feature similarity. Combining SSL and AL gives a robust segmentation method that: (1) optimally uses few labeled samples and many unlabeled samples; and (2) requires lower training time. Experimental results show our method achieves higher segmentation accuracy than FSL methods with fewer samples and reduced training effort. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. Theoretical and software considerations for general dynamic analysis using multilevel substructured models

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1985-01-01

    The dynamic analysis of complex structural systems using the finite element method and multilevel substructured models is presented. The fixed-interface method is selected for substructure reduction because of its efficiency, accuracy, and adaptability to restart and reanalysis. This method is extended to the reduction of substructures which are themselves composed of reduced substructures. The implementation and performance of the method in a general purpose software system is emphasized. Solution algorithms consistent with the chosen data structures are presented. It is demonstrated that successful finite element software requires the use of software executives to supplement the algorithmic language. The complexity of the implementation of restart and reanalysis procedures illustrates the need for executive systems to support the noncomputational aspects of the software. It is shown that significant computational efficiencies can be achieved through proper use of substructuring and reduction techniques without sacrificing solution accuracy. The restart and reanalysis capabilities and the flexible procedures for multilevel substructured modeling give economical yet accurate analyses of complex structural systems.
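
    For context, a single-level fixed-interface (Craig-Bampton-type) reduction is sketched below with NumPy/SciPy: interior degrees of freedom are condensed onto boundary DOFs plus a truncated set of fixed-interface normal modes. The partitioning details and mode count are assumptions; the paper's multilevel extension and software architecture are not reproduced here.

      import numpy as np
      from scipy.linalg import eigh

      def fixed_interface_reduction(K, M, boundary_dofs, n_modes=10):
          """Reduce dense stiffness K and mass M to boundary DOFs + n_modes
          fixed-interface modal coordinates. Returns reduced matrices and T."""
          all_dofs = np.arange(K.shape[0])
          b = np.asarray(boundary_dofs)
          i = np.setdiff1d(all_dofs, b)

          Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
          Mii = M[np.ix_(i, i)]

          # Static constraint modes: interior response to unit boundary displacements
          Psi = -np.linalg.solve(Kii, Kib)
          # Fixed-interface normal modes: eigenproblem with boundary DOFs clamped
          w2, Phi = eigh(Kii, Mii)
          Phi = Phi[:, :n_modes]

          # Transformation [interior; boundary] = T @ [modal coords; boundary]
          ni, nb = len(i), len(b)
          T = np.zeros((ni + nb, n_modes + nb))
          T[:ni, :n_modes] = Phi
          T[:ni, n_modes:] = Psi
          T[ni:, n_modes:] = np.eye(nb)

          # Reorder full matrices to [interior, boundary] before projecting
          order = np.concatenate([i, b])
          Kr = T.T @ K[np.ix_(order, order)] @ T
          Mr = T.T @ M[np.ix_(order, order)] @ T
          return Kr, Mr, T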

  2. Clustering Categorical Data Using Community Detection Techniques

    PubMed Central

    2017-01-01

    With the advent of the k-modes algorithm, the toolbox for clustering categorical data has an efficient tool that scales linearly in the number of data items. However, random initialization of cluster centers in k-modes makes it hard to reach a good clustering without resorting to many trials. Recently proposed methods for better initialization are deterministic and reduce the clustering cost considerably. A variety of initialization methods differ in how the heuristics chooses the set of initial centers. In this paper, we address the clustering problem for categorical data from the perspective of community detection. Instead of initializing k modes and running several iterations, our scheme, CD-Clustering, builds an unweighted graph and detects highly cohesive groups of nodes using a fast community detection technique. The top-k detected communities by size will define the k modes. Evaluation on ten real categorical datasets shows that our method outperforms the existing initialization methods for k-modes in terms of accuracy, precision, and recall in most of the cases. PMID:29430249
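
    A hedged sketch of a CD-Clustering-style initialization is shown below: build an unweighted graph over data items, detect communities, and let the k largest communities define the initial modes for k-modes. The edge rule (items sharing at least min_shared attribute values) and the use of greedy modularity maximization as the "fast community detection technique" are assumptions standing in for the paper's exact construction.

      import numpy as np
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      def cd_init_modes(data, k, min_shared=2):
          """data: (n_items, n_attributes) array of categorical values.
          Returns k initial modes derived from the k largest communities."""
          n, d = data.shape
          G = nx.Graph()
          G.add_nodes_from(range(n))
          for a in range(n):
              for b in range(a + 1, n):
                  if np.sum(data[a] == data[b]) >= min_shared:
                      G.add_edge(a, b)

          communities = sorted(greedy_modularity_communities(G), key=len, reverse=True)[:k]
          modes = []
          for comm in communities:
              members = data[list(comm)]
              # Mode = most frequent value of each attribute within the community
              mode = [max(set(col), key=list(col).count) for col in members.T]
              modes.append(mode)
          return np.array(modes, dtype=data.dtype)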

  3. Region-confined restoration method for motion-blurred star image of the star sensor under dynamic conditions.

    PubMed

    Ma, Liheng; Bernelli-Zazzera, Franco; Jiang, Guangwen; Wang, Xingshu; Huang, Zongsheng; Qin, Shiqiao

    2016-06-10

    Under dynamic conditions, the centroiding accuracy of the motion-blurred star image decreases and the number of identified stars is reduced, which degrades the attitude accuracy of the star sensor. To improve the attitude accuracy, a region-confined restoration method, which concentrates on noise removal and signal-to-noise ratio (SNR) improvement of the motion-blurred star images, is proposed for the star sensor under dynamic conditions. A multi-seed-region growing technique with a kinematic recursive model of star image motion is given to find the star image regions and to remove the noise. Subsequently, a restoration strategy is employed in the extracted regions, taking time consumption and SNR improvement into consideration simultaneously. Simulation results indicate that the region-confined restoration method is effective in removing noise and improving the centroiding accuracy. The identification rate and the average number of identified stars in the experiments verify the advantages of the region-confined restoration method.

  4. Improvement of Vehicle Positioning Using Car-to-Car Communications in Consideration of Communication Delay

    NASA Astrophysics Data System (ADS)

    Hontani, Hidekata; Higuchi, Yuya

    In this article, we propose a vehicle positioning method that can estimate the positions of cars even in areas where GPS is not available. For the estimation, each car measures the relative distance to the car running in front of it, communicates the measurements to other cars, and uses the received measurements to estimate its own position. In order to estimate the position even if the measurements are received with time delay, we employed time-delay tolerant Kalman filtering. For sharing the measurements, it is assumed that a car-to-car communication system is used. Measurements sent from farther cars are then received with larger time delay, so the accuracy of the estimates for farther cars becomes worse. Hence, the proposed method manages only the states of nearby cars to reduce computing effort. The authors simulated the proposed filtering method and found that it estimates the positions of nearby cars as accurately as distributed Kalman filtering.

  5. An optimization method for defects reduction in fiber laser keyhole welding

    NASA Astrophysics Data System (ADS)

    Ai, Yuewei; Jiang, Ping; Shao, Xinyu; Wang, Chunming; Li, Peigen; Mi, Gaoyang; Liu, Yang; Liu, Wei

    2016-01-01

    Laser welding has been widely used in automotive, power, chemical, nuclear and aerospace industries. The quality of welded joints is closely related to the existing defects, which are primarily determined by the welding process parameters. This paper proposes a defect optimization method that takes the formation mechanism of welding defects and weld geometric features into consideration. The analysis of the welding defect formation mechanism aims to investigate the relationship between welding defects and process parameters, and weld features are considered to identify the optimal process parameters for the desired welded joints with minimum defects. An improved back-propagation neural network, which models nonlinear problems well, is adopted to establish the mathematical model, and the obtained model is solved by a genetic algorithm. The proposed method is validated by macroweld profile, microstructure and microhardness in the confirmation tests. The results show that the proposed method is effective at reducing welding defects and obtaining high-quality joints for fiber laser keyhole welding in practical production.

  6. The Shortlist Method for fast computation of the Earth Mover's Distance and finding optimal solutions to transportation problems.

    PubMed

    Gottschlich, Carsten; Schuhmacher, Dominic

    2014-01-01

    Finding solutions to the classical transportation problem is of great importance, since this optimization problem arises in many engineering and computer science applications. Especially the Earth Mover's Distance is used in a plethora of applications ranging from content-based image retrieval, shape matching, fingerprint recognition, object tracking and phishing web page detection to computing color differences in linguistics and biology. Our starting point is the well-known revised simplex algorithm, which iteratively improves a feasible solution to optimality. The Shortlist Method that we propose substantially reduces the number of candidates inspected for improving the solution, while at the same time balancing the number of pivots required. Tests on simulated benchmarks demonstrate a considerable reduction in computation time for the new method as compared to the usual revised simplex algorithm implemented with state-of-the-art initialization and pivot strategies. As a consequence, the Shortlist Method facilitates the computation of large scale transportation problems in viable time. In addition we describe a novel method for finding an initial feasible solution which we coin Modified Russell's Method.
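
    For reference, the classical transportation problem that the Shortlist Method accelerates can be stated and solved with a generic LP solver as sketched below (this is only the baseline formulation, not the Shortlist Method or the revised simplex variant discussed in the paper; names and the solver choice are assumptions). The Earth Mover's Distance corresponds to the optimal cost of such a plan, normalized by the total mass moved.

      import numpy as np
      from scipy.optimize import linprog

      def transportation_lp(cost, supply, demand):
          """Solve min sum c_ij x_ij  s.t. row sums = supply, column sums = demand,
          x >= 0. Total supply must equal total demand for feasibility."""
          m, n = cost.shape
          c = cost.ravel()
          A_eq = np.zeros((m + n, m * n))
          for i in range(m):
              A_eq[i, i * n:(i + 1) * n] = 1.0   # row i ships exactly its supply
          for j in range(n):
              A_eq[m + j, j::n] = 1.0            # column j receives exactly its demand
          b_eq = np.concatenate([supply, demand])
          res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
          return res.x.reshape(m, n), res.fun

      # Example with two sources and three sinks (hypothetical data)
      cost = np.array([[4.0, 6.0, 9.0],
                       [5.0, 3.0, 7.0]])
      plan, total_cost = transportation_lp(cost, supply=np.array([20.0, 30.0]),
                                           demand=np.array([10.0, 25.0, 15.0]))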

  7. Exact and Metaheuristic Approaches for a Bi-Objective School Bus Scheduling Problem.

    PubMed

    Chen, Xiaopan; Kong, Yunfeng; Dang, Lanxue; Hou, Yane; Ye, Xinyue

    2015-01-01

    As a class of hard combinatorial optimization problems, the school bus routing problem has received considerable attention in the last decades. For a multi-school system, given the bus trips for each school, the school bus scheduling problem aims at optimizing bus schedules to serve all the trips within the school time windows. In this paper, we propose two approaches for solving the bi-objective school bus scheduling problem: an exact method of mixed integer programming (MIP) and a metaheuristic method which combines simulated annealing with local search. We develop MIP formulations for homogenous and heterogeneous fleet problems respectively and solve the models by MIP solver CPLEX. The bus type-based formulation for heterogeneous fleet problem reduces the model complexity in terms of the number of decision variables and constraints. The metaheuristic method is a two-stage framework for minimizing the number of buses to be used as well as the total travel distance of buses. We evaluate the proposed MIP and the metaheuristic method on two benchmark datasets, showing that on both instances, our metaheuristic method significantly outperforms the respective state-of-the-art methods.

  8. Springback compensation for a vehicle's steel body panel

    NASA Astrophysics Data System (ADS)

    Bałon, Paweł; Świątoniowski, Andrzej; Szostak, Janusz; Kiełbasa, Bartłomiej

    2017-10-01

    This paper considers a structural element of a vehicle made from high-strength steel (HSS). Applying this kind of material considerably reduces construction mass owing to its high strength, but it also leads to springback, which depends mainly on the material used as well as on the part. Reducing springback helps to reach the reference geometry of the element, and Finite Element Method (FEM) software is used for this purpose. The authors compared two methods of die shape optimization: the first defines the compensation of the die shape only for operation OP-20, while the second, multi-operation method defines the compensation of the die shape for both the OP-20 and OP-50 operations. Predicting springback by trial and error is difficult and labour-intensive, so designing dies requires appropriate FEM software to make the process more economical and less time-consuming. Virtual compensation methods make it possible to obtain precise results in a short time. The software-based die compensation was verified experimentally with a prototype die. Springback deformation is therefore a critical problem for HSS parts, especially when the geometry is complex.

  9. The Shortlist Method for Fast Computation of the Earth Mover's Distance and Finding Optimal Solutions to Transportation Problems

    PubMed Central

    Gottschlich, Carsten; Schuhmacher, Dominic

    2014-01-01

    Finding solutions to the classical transportation problem is of great importance, since this optimization problem arises in many engineering and computer science applications. Especially the Earth Mover's Distance is used in a plethora of applications ranging from content-based image retrieval, shape matching, fingerprint recognition, object tracking and phishing web page detection to computing color differences in linguistics and biology. Our starting point is the well-known revised simplex algorithm, which iteratively improves a feasible solution to optimality. The Shortlist Method that we propose substantially reduces the number of candidates inspected for improving the solution, while at the same time balancing the number of pivots required. Tests on simulated benchmarks demonstrate a considerable reduction in computation time for the new method as compared to the usual revised simplex algorithm implemented with state-of-the-art initialization and pivot strategies. As a consequence, the Shortlist Method facilitates the computation of large scale transportation problems in viable time. In addition we describe a novel method for finding an initial feasible solution which we coin Modified Russell's Method. PMID:25310106

  10. Features calibration of the dynamic force transducers

    NASA Astrophysics Data System (ADS)

    Prilepko, M. Yu., D. Sc.; Lysenko, V. G.

    2018-04-01

    The article discusses calibration methods for dynamic force measuring instruments. The relevance of this work stems from the need for valid determination of the metrological characteristics of dynamic force transducers, taking into account their intended application. The aim of this work is to justify the choice of a calibration method that allows the metrological characteristics of dynamic force transducers to be determined under simulated operating conditions, so that their suitability for the intended use can be assessed. The following tasks are solved: the mathematical model and the main measurement equation for calibrating dynamic force transducers by load weight are formulated, and the main components of the calibration uncertainty budget are defined. A new method for calibrating dynamic force transducers is proposed, which uses a reference "force-deformation" converter based on a calibrated elastic element whose deformation is measured by a laser interferometer. The mathematical model and the main measurement equation of the proposed method are constructed. It is shown that a calibration method based on laser-interferometer measurements of the deformation of the calibrated elastic element makes it possible to eliminate or considerably reduce the uncertainty budget components inherent in the load-weight method.
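
    As a simple illustration of the measurement idea behind the reference "force-deformation" converter, the sketch below reconstructs force from the calibrated stiffness of an elastic element and its interferometer-measured deformation, and combines the two uncertainty contributions. The linear model F = k·x and the uncorrelated root-sum-square combination are assumptions for illustration, not the paper's full uncertainty budget.

      import numpy as np

      def force_from_deformation(k_elastic, u_k, deformation, u_d):
          """F = k * x with combined relative standard uncertainty from
          the stiffness (u_k) and deformation (u_d) contributions."""
          F = k_elastic * deformation
          rel_u = np.sqrt((u_k / k_elastic) ** 2 + (u_d / deformation) ** 2)
          return F, F * rel_u

      # Hypothetical example: 50 kN/m elastic element, 2 mm measured deformation
      F, u_F = force_from_deformation(5.0e4, 50.0, 2.0e-3, 2.0e-9)
      print(f"F = {F:.3f} N ± {u_F:.4f} N")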

  11. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 1. Mathematical models, computing methods, and results. Final report. [GENESIS, OPCON and OPPLAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    Existing methods for generating capacity reliability evaluation do not explicitly recognize a number of operating considerations which may have important effects in system reliability performance. Thus, current methods may yield estimates of system reliability which differ appreciably from actual observed reliability. Further, current methods offer no means of accurately studying or evaluating alternatives which may differ in one or more operating considerations. Operating considerations which are considered to be important in generating capacity reliability evaluation include: unit duty cycles as influenced by load cycle shape, reliability performance of other units, unit commitment policy, and operating reserve policy; unit start-up failures distinct from unit running failures; unit start-up times; and unit outage postponability and the management of postponable outages. A detailed Monte Carlo simulation computer model called GENESIS and two analytical models called OPCON and OPPLAN have been developed which are capable of incorporating the effects of many operating considerations including those noted above. These computer models have been used to study a variety of actual and synthetic systems and are available from EPRI. The new models are shown to produce system reliability indices which differ appreciably from index values computed using traditional models which do not recognize operating considerations.

  12. Health care worker perspectives of their motivation to reduce health care-associated infections.

    PubMed

    McClung, Laura; Obasi, Chidi; Knobloch, Mary Jo; Safdar, Nasia

    2017-10-01

    Health care-associated infections (HAIs) are largely preventable, but are associated with considerable health care burden. Given the significant cost of HAIs, many health care institutions have implemented bundled interventions to reduce HAIs. These complex behavioral interventions require considerable effort; however, individual behaviors and motivations crucial to successful and sustained implementation have not been adequately assessed. We evaluated health care worker motivations to reduce HAIs. This was a phenomenologic qualitative study of health care workers in different roles within a university hospital, recruited via a snowball strategy. Using constructs from the Consolidated Framework for Implementation Research model, face-to-face semi-structured interviews were used to explore perceptions of health care worker motivation to follow protocols on HAI prevention. Across all types of health care workers interviewed, patient safety and improvement in clinical outcomes were the major motivators to reducing HAIs. Other important motivators included collaborative environment that valued individual input, transparency and feedback at both organizational and individual levels, leadership involvement, and refresher trainings and workshops. We did not find policy, regulatory considerations, or financial penalties to be important motivators. Health care workers perceived patient safety and clinical outcomes as the primary motivators to reduce HAI. Leadership engagement and data-driven interventions with frequent performance feedback were also identified as important facilitators of HAI prevention. Published by Elsevier Inc.

  13. Using in vitro/in silico data for consumer safety assessment of feed flavoring additives--A feasibility study using piperine.

    PubMed

    Thiel, A; Etheve, S; Fabian, E; Leeman, W R; Plautz, J R

    2015-10-01

    Consumer health risk assessment for feed additives is based on comparing the estimated human exposure to additive residues that may occur in edible livestock tissues with the additive's hazard. We present an approach using alternative methods for consumer health risk assessment. The aim was to use the smallest possible number of animals to estimate the hazard and human exposure without jeopardizing safety upon use. As an example we selected the feed flavoring substance piperine and applied in silico modeling for residue estimation, results from literature surveys, and Read-Across to assess metabolism in different species. Results were compared to experimental in vitro metabolism data in rat and chicken, and to quantitative analysis of residue levels from the in vivo situation in livestock. In silico residue modeling proved to be a worst case: the modeled residual levels were considerably higher than the measured residual levels. The in vitro evaluation of livestock versus rodent metabolism revealed no major differences in metabolism between the species. We successfully performed a consumer health risk assessment without performing additional animal experiments. As shown, the use and combination of different alternative methods supports animal welfare considerations and provides a future perspective for reducing the number of animals used. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Iron oxide nanotubes synthesized via template-based electrodeposition

    NASA Astrophysics Data System (ADS)

    Lim, Jin-Hee; Min, Seong-Gi; Malkinski, Leszek; Wiley, John B.

    2014-04-01

    Considerable effort has been invested in the development of synthetic methods for the preparation of iron oxide nanostructures for applications in nanotechnology. While a variety of structures have been reported, only a few studies have focused on iron oxide nanotubes. Here, we present details on the synthesis and characterization of iron oxide nanotubes along with a proposed mechanism for FeOOH tube formation. The FeOOH nanotubes, fabricated via a template-based electrodeposition method, are found to exhibit a unique inner surface. Heat treatment of these tubes under oxidizing or reducing atmospheres can produce either hematite (α-Fe2O3) or magnetite (Fe3O4) structures, respectively. The hematite nanotubes are composed of small nanoparticles less than 20 nm in diameter, and their magnetization and FC-ZFC curves show superparamagnetic properties without the Morin transition. The magnetite nanotubes, which consist of slightly larger nanoparticles, show ferromagnetic magnetization curves with weak coercivity at room temperature, while their FC-ZFC curves exhibit the Verwey transition at 125 K. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr06924a

  15. Exploring multi-scale forest above ground biomass estimation with optical remote sensing imageries

    NASA Astrophysics Data System (ADS)

    Koju, U.; Zhang, J.; Gilani, H.

    2017-02-01

    Forests account for 80% of the total exchange of carbon between the atmosphere and the terrestrial ecosystem, so monitoring forest above-ground biomass (carbon can be calculated as 0.47 of total biomass) has become very important. Above-ground biomass, being the major portion of total forest biomass, deserves very careful consideration in its estimation. Such estimates can help address the ongoing problems of deforestation and degradation and support carbon mitigation benefits through mechanisms like Reducing Emissions from Deforestation and Forest Degradation (REDD+). Many methods of above-ground biomass estimation are in use, ranging from optical remote sensing imagery of very high to very low resolution to SAR data and LiDAR. This paper describes a multi-scale approach for assessing forest above-ground biomass, and ultimately carbon stocks, using very high resolution imagery, open-source imagery and medium-resolution satellite datasets with a very limited number of field plots. We found this to be one of the most promising methods for forest above-ground biomass estimation, offering higher accuracy at low cost. A pilot study on biomass estimation using this technique was conducted in the Chitwan district of Nepal. GeoEye-1 (0.5 m), Landsat (30 m) and Google Earth (GE) images were used as the remote sensing imagery. Object-based image analysis (OBIA) classification was applied to the GeoEye imagery for tree crown delineation at the watershed level. A crown projection area (CPA) versus biomass model was then developed and validated at the watershed level. Open-source GE imagery was used to calculate CPA and biomass from virtual plots at the district level. Using data mining techniques, different parameters from the Landsat imagery, together with the virtual-sample biomass, were used to upscale the biomass estimate to the district level. We found that this approach can considerably reduce the field data required for estimating biomass and carbon in comparison with inventory methods based on enumeration of all trees in a plot. The proposed methodology is very cost-effective and can be replicated with limited resources and time.
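
    A CPA-versus-biomass model of the kind mentioned above is often fitted as a power law; the sketch below shows one plausible form using a log-log regression, biomass = a · CPA^b. The power-law form, the variable names and the example numbers are assumptions for illustration; the study does not state its exact model.

      import numpy as np

      def fit_cpa_biomass_model(cpa, biomass):
          """Fit biomass = a * CPA**b by linear regression in log-log space."""
          b, log_a = np.polyfit(np.log(cpa), np.log(biomass), deg=1)
          a = np.exp(log_a)
          predict = lambda cpa_new: a * np.asarray(cpa_new) ** b
          return a, b, predict

      # Hypothetical plot data (CPA in m^2, above-ground biomass in kg)
      cpa = np.array([12.0, 25.0, 40.0, 60.0, 85.0])
      agb = np.array([95.0, 240.0, 430.0, 700.0, 1050.0])
      a, b, predict = fit_cpa_biomass_model(cpa, agb)
      print(f"AGB ~ {a:.2f} * CPA^{b:.2f}; predicted AGB for CPA=50 m^2: {predict(50.0):.0f} kg")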

  16. Reduced glutathione enhances fertility of frozen/thawed C57BL/6 mouse sperm after exposure to methyl-beta-cyclodextrin.

    PubMed

    Takeo, Toru; Nakagata, Naomi

    2011-11-01

    Sperm cryopreservation is useful for the effective storage of genomic resources derived from genetically engineered mice. However, freezing the sperm of C57BL/6 mice, the most commonly used genetic background for genetically engineered mice, considerably reduces its fertility. We previously reported that methyl-beta-cyclodextrin dramatically improved the fertility of frozen/thawed C57BL/6 mouse sperm. Recently, it was reported that exposing sperm to reduced glutathione may alleviate oxidative stress in frozen/thawed mouse sperm, thereby enhancing in vitro fertilization (IVF); however, the mechanism underlying this effect is poorly understood. In the present study, we examined the combined effects of methyl-beta-cyclodextrin and reduced glutathione on the fertilization rate of IVF with frozen/thawed C57BL/6 mouse sperm and the characteristic changes in the zona pellucida induced by reduced glutathione. Adding reduced glutathione to the fertilization medium increased the fertilization rate. Methyl-beta-cyclodextrin and reduced glutathione independently increased fertilization rates, and their combination produced the strongest effect. We found that reduced glutathione increased the amount of free thiols in the zona pellucida and promoted zona pellucida enlargement. Finally, 2-cell embryos produced by IVF with the addition of reduced glutathione developed normally and produced live offspring. In summary, we have established a novel IVF method using methyl-beta-cyclodextrin during sperm preincubation and reduced glutathione during the IVF procedure to enhance fertility of frozen/thawed C57BL/6 mouse sperm.

  17. Snakebite and Its Socio-Economic Impact on the Rural Population of Tamil Nadu, India

    PubMed Central

    Vaiyapuri, Sakthivel; Vaiyapuri, Rajendran; Ashokan, Rajesh; Ramasamy, Karthikeyan; Nattamaisundar, Kameshwaran; Jeyaraj, Anburaj; Chandran, Viswanathan; Gajjeraman, Prabu; Baksh, M. Fazil; Gibbins, Jonathan M.; Hutchinson, E. Gail

    2013-01-01

    Background Snakebite represents a significant health issue worldwide, affecting several million people each year with as many as 95,000 deaths. India is considered to be the country most affected, but much remains unknown about snakebite incidence in this country, its socio-economic impact and how snakebite management could be improved. Methods/Principal Findings We conducted a study within rural villages in Tamil Nadu, India, which combines a household survey (28,494 people) of snakebite incidence with a more detailed survey of victims in order to understand the health and socio-economic effects of the bite, the treatments obtained and their views about future improvements. Our survey suggests that snakebite incidence is higher than previously reported. 3.9% of those surveyed had suffered from snakebite and the number of deaths corresponds to 0.45% of the population. The socio-economic impact of this is very considerable in terms of the treatment costs and the long-term effects on the health and ability of survivors to work. To reduce this, the victims recommended improvements to the accessibility and affordability of antivenom treatment. Conclusions Snakebite has a considerable and disproportionate impact on rural populations, particularly in South Asia. This study provides an incentive for researchers and the public to work together to reduce the incidence and improve the outcomes for snake bite victims and their families. PMID:24278244

  18. No time to waste organic waste: Nanosizing converts remains of food processing into refined materials.

    PubMed

    Griffin, Sharoon; Sarfraz, Muhammad; Farida, Verda; Nasim, Muhammad Jawad; Ebokaiwe, Azubuike P; Keck, Cornelia M; Jacob, Claus

    2018-03-15

    Modern food processing results in considerable amounts of side-products, such as grape seeds, walnut shells, spent coffee grounds, and harvested tomato plants. These materials are still rich in valuable and biologically active substances and therefore of interest from the perspective of waste management and "up-cycling". In contrast to traditional, often time consuming and low-value uses, such as vermicomposting and anaerobic digestion, the complete conversion into nanosuspensions unlocks considerable potentials of and new applications for such already spent organic materials without the need of extraction and without producing any additional waste. In this study, nanosuspensions were produced using a sequence of milling and homogenization methods, including High Speed Stirring (HSS) and High Pressure Homogenization (HPH) which reduced the size of the particles to 200-400 nm. The resulting nanosuspensions demonstrated nematicidal and antimicrobial activity and their antioxidant activities exceeded the ones of the bulk materials. In the future, this simple nanosizing approach may fulfil several important objectives, such as reducing and turning readily available waste into new value and eventually closing a crucial cycle of agricultural products returning to their fields - with a resounding ecological impact in the fields of medicine, agriculture, cosmetics and fermentation. Moreover, up-cycling via nanosizing adds an economical promise of increased value to residue-free waste management. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Liquid crystal point diffraction interferometer. Ph.D. Thesis - Arizona Univ., 1995

    NASA Technical Reports Server (NTRS)

    Mercer, Carolyn R.

    1995-01-01

    A new instrument, the liquid crystal point diffraction interferometer (LCPDI), has been developed for the measurement of phase objects. This instrument maintains the compact, robust design of Linnik's point diffraction interferometer (PDI) and adds to it phase stepping capability for quantitative interferogram analysis. The result is a compact, simple-to-align, environmentally insensitive interferometer capable of accurately measuring optical wavefronts with very high data density and with automated data reduction. This dissertation describes the theory of both the PDI and liquid crystal phase control. The design considerations for the LCPDI are presented, including manufacturing considerations. The operation and performance of the LCPDI are discussed, including sections regarding alignment, calibration, and amplitude modulation effects. The LCPDI is then demonstrated using two phase objects: a defocus difference wavefront, and a temperature distribution across a heated chamber filled with silicone oil. The measured results are compared to theoretical or independently measured results and show excellent agreement. A computer simulation of the LCPDI was performed to verify the source of an observed periodic phase measurement error. The error stems from intensity variations caused by dye molecules rotating within the liquid crystal layer. Methods and algorithms are presented which reduce this error; they are also useful for any phase-stepping interferometer that has unwanted intensity fluctuations, such as those caused by unregulated lasers.
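
    To illustrate the kind of quantitative analysis that phase stepping enables, the sketch below implements the textbook four-step phase-retrieval formula. This is a generic algorithm, not the dissertation's specific processing; the frame names and the 90-degree step size are assumptions.

      import numpy as np

      def four_step_phase(I0, I90, I180, I270):
          """Wrapped phase from four intensity frames recorded at 0, 90, 180 and
          270 degree phase steps: phi = atan2(I270 - I90, I0 - I180)."""
          return np.arctan2(I270 - I90, I0 - I180)

      # The wrapped phase would then be unwrapped (e.g. with
      # skimage.restoration.unwrap_phase) before converting to optical path difference.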

  20. Indomethacin nanocrystals prepared by different laboratory scale methods: effect on crystalline form and dissolution behavior

    NASA Astrophysics Data System (ADS)

    Martena, Valentina; Censi, Roberta; Hoti, Ela; Malaj, Ledjan; Di Martino, Piera

    2012-12-01

    The objective of this study is to select very simple and well-known laboratory scale methods able to reduce the particle size of indomethacin down to the nanometric scale. The effect on the crystalline form and the dissolution behavior of the different samples was deliberately evaluated in the absence of any surfactants as stabilizers. Nanocrystals of indomethacin (IDM; native crystals are in the γ form) were obtained by three laboratory scale methods: A (Batch A: crystallization by solvent evaporation in a nano-spray dryer), B (Batches B-15 and B-30: wet milling and lyophilization), and C (Batches C-20-N and C-40-N: cryo-milling in the presence of liquid nitrogen). Nanocrystals obtained by method A (Batch A) crystallized into a mixture of the α and γ polymorphic forms. IDM obtained by the two other methods remained in the γ form, and a different tendency toward decreased crystallinity was observed, with a more considerable decrease in crystalline degree for IDM milled for 40 min in the presence of liquid nitrogen. The intrinsic dissolution rate (IDR) revealed a higher dissolution rate for Batches A and C-40-N, due to the higher IDR of the α form compared with the γ form for Batch A, and the lower crystallinity degree for both Batches A and C-40-N. These factors, as well as the decrease in particle size, influenced the IDM dissolution rate from the particle samples. Modifications in the solid physical state that may occur using different particle size reduction treatments have to be taken into consideration during the scale-up and industrial development of new solid dosage forms.

  1. Study of a reinforced concrete beam strengthened using a combination of SMA wire and CFRP plate

    NASA Astrophysics Data System (ADS)

    Liu, Zhi-qiang; Li, Hui

    2006-03-01

    Traditional methods used for strengthening reinforced concrete (RC) structures, such as bonding of steel plates, suffer from inherent disadvantages. In recent years, strengthening of RC structures using carbon fiber reinforced polymer (CFRP) plates has attracted considerable attention around the world. Most existing research on CFRP plate bonding for flexural strengthening of RC beams has focused on strength enhancement, while little research has addressed the effect of residual deformations on the strengthening. Residual deformations have an important effect on strengthening with CFRP plates, and reducing them remains a significant challenge. Shape memory alloy (SMA) has shown outstanding functional properties as an actuator, and it may be possible to use SMA to reduce residual deformation and close concrete cracks by imposing recovery forces on the concrete in the tensile zone. On its own, however, this is only an emergency damage repair, since the SMA wires need to be heated continuously. Therefore, an innovative method of strengthening an RC beam with CFRP plates in combination with SMA wires was first investigated experimentally in this paper. In addition, the nonlinear finite element software ABAQUS was employed to further simulate the behavior of RC beams strengthened by the new method. It was found that this is an excellent and effective strengthening method.

  2. Fusion of digital breast tomosynthesis images via wavelet synthesis for improved lesion conspicuity

    NASA Astrophysics Data System (ADS)

    Hariharan, Harishwaran; Pomponiu, Victor; Zheng, Bin; Whiting, Bruce; Gur, David

    2014-03-01

    Full-field digital mammography (FFDM) is the most common screening procedure for detecting early breast cancer. However, due to complications such as overlapping breast tissue in projection images, the efficacy of FFDM reading is reduced. Recent studies have shown that digital breast tomosynthesis (DBT), in combination with FFDM, increases detection sensitivity considerably while decreasing false-positive recall rates. There is considerable interest in creating diagnostically accurate 2-D interpretations from the DBT slices. Most 2-D syntheses rely on visualizing the maximum intensities (brightness) from each slice through different methods. We propose a wavelet based fusion method, in which we focus on preserving holistic information from larger structures such as masses while adding high frequency information that is relevant and helpful for diagnosis. This method enables the generation of a 2D image from a series of DBT images, each of which contains both smooth and coarse structures distributed in the wavelet domain. We believe that the wavelet-synthesized images, generated from their DBT image datasets, provide radiologists with improved lesion and micro-calcification conspicuity as compared with FFDM images. The potential impact of this fusion method is (1) conception of a device-independent, data-driven modality that increases the conspicuity of lesions, thereby facilitating early detection and potentially reducing recall rates; and (2) reduction of the accompanying radiation dose to the patient.
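
    One plausible way to fuse a stack of slices in the wavelet domain is sketched below with PyWavelets: average the approximation band to preserve holistic low-frequency structure and keep the largest-magnitude detail coefficients across slices to retain high-frequency information. The wavelet, level and fusion rules are assumptions, not the authors' published synthesis.

      import numpy as np
      import pywt

      def fuse_slices(slices, wavelet="db4", level=3):
          """slices: list of equally sized 2D arrays. Returns one fused 2D image."""
          decomps = [pywt.wavedec2(s, wavelet, level=level) for s in slices]
          fused = [np.mean([d[0] for d in decomps], axis=0)]   # averaged approximation band
          for lev in range(1, level + 1):
              bands = []
              for b in range(3):   # horizontal, vertical, diagonal detail bands
                  stack = np.stack([d[lev][b] for d in decomps])
                  idx = np.argmax(np.abs(stack), axis=0)
                  bands.append(np.take_along_axis(stack, idx[None], axis=0)[0])
              fused.append(tuple(bands))
          return pywt.waverec2(fused, wavelet)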

  3. Application of L1-norm regularization to epicardial potential reconstruction based on gradient projection.

    PubMed

    Wang, Liansheng; Qin, Jing; Wong, Tien Tsin; Heng, Pheng Ann

    2011-10-07

    The epicardial potential (EP)-targeted inverse problem of electrocardiography (ECG) has been widely investigated as it is demonstrated that EPs reflect underlying myocardial activity. It is a well-known ill-posed problem as small noises in input data may yield a highly unstable solution. Traditionally, L2-norm regularization methods have been proposed to solve this ill-posed problem. But the L2-norm penalty function inherently leads to considerable smoothing of the solution, which reduces the accuracy of distinguishing abnormalities and locating diseased regions. Directly using the L1-norm penalty function, however, may greatly increase computational complexity due to its non-differentiability. We propose an L1-norm regularization method in order to reduce the computational complexity and make rapid convergence possible. Variable splitting is employed to make the L1-norm penalty function differentiable based on the observation that both positive and negative potentials exist on the epicardial surface. Then, the inverse problem of ECG is further formulated as a bound-constrained quadratic problem, which can be efficiently solved by gradient projection in an iterative manner. Extensive experiments conducted on both synthetic data and real data demonstrate that the proposed method can handle both measurement noise and geometry noise and obtain more accurate results than previous L2- and L1-norm regularization methods, especially when the noises are large.
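
    After variable splitting, the problem becomes a bound-constrained quadratic program; a minimal projected-gradient iteration for that problem class is sketched below. The fixed step size based on the spectral norm and the stopping rule are assumptions; the paper's exact iteration and splitting details are not reproduced.

      import numpy as np

      def projected_gradient_qp(A, b, n_iter=500, step=None):
          """Solve  min 0.5 z^T A z - b^T z  s.t. z >= 0  by gradient projection,
          for symmetric positive semidefinite A."""
          if step is None:
              step = 1.0 / np.linalg.norm(A, 2)       # 1 / Lipschitz constant of the gradient
          z = np.zeros(b.shape)
          for _ in range(n_iter):
              grad = A @ z - b
              z = np.maximum(z - step * grad, 0.0)    # gradient step, then project onto z >= 0
          return z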

  4. Novel method to leukoreduce murine blood for transfusion: how to reduce animal usage.

    PubMed

    Fischer, Dania; Büssow, Julian; Meybohm, Patrick; Zacharowski, Kai; Jennewein, Carla

    2016-01-01

    Basic research on the pathomechanisms of transfusion-related adverse events depends on murine transfusion models, in which leukoreduction (LR) is a prevalent standard. The commonly used neonatal LR filter (LRF) is associated with considerable animal numbers. A more efficient method would help support the guiding principles of "replacement, reduction, refinement" (3Rs). Blood from C57BL/6 and C57BL/6-Tg(UBC-GFP)30Scha/J mice was leukoreduced using (1) a neonatal LRF, (2) a syringe LRF, or (3) CD45 microbeads. Product quality was assessed according to US Food and Drug Administration (FDA) standards. White blood cell numbers were analyzed by flow cytometry; hemoglobin concentrations and hematocrit were measured and in vivo posttransfusion recoveries were determined after 2 weeks of storage. Using the neonatal filter, a LR of 99.56% was achieved with wastage of 12.4 mL in comparison to 99.68% and 1-mL hold-up volume with the syringe filter and 99.11 ± 0.24% LR and 0.1-mL wastage using microbeads. All techniques achieved FDA quality standards, apart from posttransfusion recovery rate, which was only reached by the microbeads-based technique. LR with CD45 microbeads not only reduces animal usage but also provides a more efficacious method regarding posttransfusion red blood cell recovery and, hence, provides a promising alternative to commonly used methods. © 2015 AABB.

  5. A Novel Method Using Abstract Convex Underestimation in Ab-Initio Protein Structure Prediction for Guiding Search in Conformational Feature Space.

    PubMed

    Hao, Xiao-Hu; Zhang, Gui-Jun; Zhou, Xiao-Gen; Yu, Xu-Feng

    2016-01-01

    To address the searching problem of protein conformational space in ab-initio protein structure prediction, a novel method using abstract convex underestimation (ACUE) based on the framework of evolutionary algorithm was proposed. Computing such conformations, essential to associate structural and functional information with gene sequences, is challenging due to the high-dimensionality and rugged energy surface of the protein conformational space. As a consequence, the dimension of protein conformational space should be reduced to a proper level. In this paper, the high-dimensionality original conformational space was converted into feature space whose dimension is considerably reduced by feature extraction technique. And, the underestimate space could be constructed according to abstract convex theory. Thus, the entropy effect caused by searching in the high-dimensionality conformational space could be avoided through such conversion. The tight lower bound estimate information was obtained to guide the searching direction, and the invalid searching area in which the global optimal solution is not located could be eliminated in advance. Moreover, instead of expensively calculating the energy of conformations in the original conformational space, the estimate value is employed to judge if the conformation is worth exploring to reduce the evaluation time, thereby making computational cost lower and the searching process more efficient. Additionally, fragment assembly and the Monte Carlo method are combined to generate a series of metastable conformations by sampling in the conformational space. The proposed method provides a novel technique to solve the searching problem of protein conformational space. Twenty small-to-medium structurally diverse proteins were tested, and the proposed ACUE method was compared with It Fix, HEA, Rosetta and the developed method LEDE without underestimate information. Test results show that the ACUE method can more rapidly and more efficiently obtain the near-native protein structure.

  6. Training the Developing Brain Part II: Cognitive Considerations for Youth Instruction and Feedback

    PubMed Central

    Kushner, Adam M.; Kiefer, Adam W.; Lesnick, Samantha; Faigenbaum, Avery D.; Kashikar-Zuck, Susmita; Myer, Gregory D.

    2015-01-01

    Growing numbers of youth participating in competitive, organized physical activity have led to concern about the risk of sports related injuries during important periods of human development. Recent studies have demonstrated the ability of Integrative Neuromuscular Training (INT) to enhance athletic performance and to reduce the risk of sports related injuries in youth. Successful implementation of INT necessitates instruction from knowledgeable and qualified instructors who understand the unique physical, cognitive and psychosocial characteristics of youth to provide appropriate training instruction and feedback. Principles of a classical theory of cognitive development provide a useful context for discussion of developmentally appropriate methods and strategies for INT instruction of youth. INT programs that consider these developmentally appropriate approaches will provide a controlled, efficacious environment for youth to improve athletic performance and to reduce risk of sports related injury, thus promoting a healthy, active lifestyle beyond an individual's formative years. PMID:25968858

  7. Methods for testing Zernike phase plates and a report on silicon-based phase plates with reduced charging and improved ageing characteristics

    PubMed Central

    Marko, Michael; Meng, Xing; Hsieh, Chyongere; Roussie, James; Striemer, Christopher

    2013-01-01

    Imaging with Zernike phase plates is increasingly being used in cryo-TEM tomography and cryo-EM single-particle applications. However, rapid ageing of the phase plates, together with the cost and effort in producing them, present serious obstacles to widespread adoption. We are experimenting with phase plates based on silicon chips that have thin windows; such phase plates could be mass-produced and made available at moderate cost. The windows are coated with conductive layers to reduce charging, and this considerably extends the useful life of the phase plates compared to traditional pure-carbon phase plates. However, a compromise must be reached between robustness and transmission through the phase-plate film. Details are given on testing phase-plate performance by means of imaging an amorphous thin film and evaluating the power spectra of the images. PMID:23994351
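
    The performance check described above rests on power spectra of images of an amorphous film; a generic radially averaged power spectrum, of the sort such an evaluation might use, is sketched below. The binning scheme and function names are assumptions, not the authors' analysis pipeline.

      import numpy as np

      def radial_power_spectrum(image, n_bins=100):
          """2D power spectrum of an image, averaged over rings of spatial frequency."""
          f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
          power = np.abs(f) ** 2
          h, w = image.shape
          yy, xx = np.indices((h, w))
          r = np.hypot(yy - h / 2, xx - w / 2)
          bins = np.linspace(0, r.max(), n_bins + 1)
          which = np.digitize(r.ravel(), bins) - 1
          radial = np.bincount(which, weights=power.ravel(), minlength=n_bins)
          counts = np.bincount(which, minlength=n_bins)
          return radial[:n_bins] / np.maximum(counts[:n_bins], 1)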

  8. Interactive data collection: benefits of integrating new media into pediatric research.

    PubMed

    Kennedy, Christine; Charlesworth, Annemarie; Chen, Jyu-Lin

    2003-01-01

    Despite the prevalence of children's computerized games for recreational and educational purposes, the use of interactive technology to obtain pediatric research data remains underexplored. This article describes the development of laptop interactive data collection (IDC) software for a children's health intervention study. The IDC integrates computer technology, children's developmental needs, and quantitative research methods that are engaging for school-age children as well as reliable and efficient for the pediatric health researcher. Using this methodology, researchers can address common problems such as maintaining a child's attention throughout an assessment session while potentially increasing their response rate and reducing missing data rates. The IDC also promises to produce more reliable data by eliminating the need for manual double entry of data and reducing much of the time and costs associated with data cleaning and management. Development and design considerations and recommendations for further use are discussed.

  9. Bipropellant propulsion with reciprocating pumps

    NASA Astrophysics Data System (ADS)

    Whitehead, John C.

    1993-06-01

    A pressure regulated gas generator rocket cycle with alternately pressurized pairs of reciprocating pumps offers thrust-on-demand operation with significantly lower inert mass than conventional spacecraft liquid propulsion systems. The operation of bipropellant feed systems with reciprocating pumps is explained, with consideration for both short and long term missions. There are several methods for startup and shutdown of this self-starting pump-fed system, with preference determined by thrust duty cycle and mission duration. Progress to date includes extensive development testing of components unique to this type of system, and several live tests with monopropellant hydrazine. Pneumatic pump control valves which render pistons and bellows automatically responsive to downstream liquid demand are significantly simpler than those described previously. A compact pumpset mounted to central liquid manifolds has a pair of oxidizer pumps pneumatically slaved to a pair of fuel pumps to reduce vibration. A warm gas pressure reducer for tank expulsion can eliminate any remaining need for inert gas storage.

  10. Causes and Solutions for High Energy Consumption in Traditional Buildings Located in Hot Climate Regions

    NASA Astrophysics Data System (ADS)

    Barayan, Olfat Mohammad

    A considerable amount of money is spent on high energy consumption in traditional buildings located in hot climate regions. Energy consumption is strongly influenced by several factors, including building materials, orientation, mass, and opening sizes. This paper aims to identify these causes and find practical solutions to reduce annual energy bills. For the purpose of this study, a simulation-based research method was followed. A comparison between two Revit models was also carried out to identify the major cause of high energy consumption. By analysing different orientations, wall insulation, and window glazing, and by applying other high-performance building techniques, the study concludes that appropriate building materials play a vital role in determining energy cost. The ability to reduce energy cost by more than 50% in traditional buildings therefore depends on a careful balance of building materials, mass, orientation, and type of window glazing.

  11. A multiple indicator, multiple cause method for representing social capital with an application to psychological distress

    NASA Astrophysics Data System (ADS)

    Congdon, Peter

    2010-03-01

    This paper describes a structural equation methodology for obtaining social capital scores for survey subjects from multiple indicators of social support, neighbourhood and trust perceptions, and memberships of organizations. It adjusts for variation that is likely to occur in levels of social capital according to geographic context (e.g. level of area deprivation, geographic region, level of urbanity) and demographic group. Social capital is used as an explanatory factor for psychological distress using data from the 2006 Health Survey for England. A highly significant effect of social capital in reducing the chance of psychiatric caseness is obtained after controlling for other individual and geographic risk factors. Allowing for social capital has considerable effects on the impacts on psychiatric health of other risk factors. In particular, the impact of area deprivation category is much reduced. There is also evidence of significant differentiation in social capital between population categories and geographic contexts.

  12. High energy X-ray phase and dark-field imaging using a random absorption mask.

    PubMed

    Wang, Hongchang; Kashyap, Yogesh; Cai, Biao; Sawhney, Kawal

    2016-07-28

    High energy X-ray imaging has a unique advantage over conventional X-ray imaging, since it enables deeper penetration into materials with significantly reduced radiation damage. However, the absorption contrast in the high energy region is considerably low because of the reduced X-ray absorption cross section of most materials. Even though X-ray phase and dark-field imaging techniques can provide substantially increased contrast and complementary information, fabricating dedicated optics for high energies remains a challenge. To address this issue, we present an alternative X-ray imaging approach that produces transmission, phase and scattering signals at high X-ray energies by using a random absorption mask. Importantly, in addition to the synchrotron radiation source, this approach has been demonstrated for practical imaging applications with a laboratory-based microfocus X-ray source. This new imaging method could be useful for studying thick samples or heavy materials in advanced materials science research.

  13. Design optimization of dual-axis driving mechanism for satellite antenna with two planar revolute clearance joints

    NASA Astrophysics Data System (ADS)

    Bai, Zheng Feng; Zhao, Ji Jun; Chen, Jun; Zhao, Yang

    2018-03-01

    In the dynamic analysis of satellite antenna dual-axis driving mechanism, it is usually assumed that the joints are ideal or perfect without clearances. However, in reality, clearances in joints are unavoidable due to assemblage, manufacturing errors and wear. When clearance is introduced to the mechanism, it will lead to poor dynamic performances and undesirable vibrations due to impact forces in clearance joint. In this paper, a design optimization method is presented to reduce the undesirable vibrations of satellite antenna considering clearance joints in dual-axis driving mechanism. The contact force model in clearance joint is established using a nonlinear spring-damper model and the friction effect is considered using a modified Coulomb friction model. Firstly, the effects of clearances on dynamic responses of satellite antenna are investigated. Then the optimization method for dynamic design of the dual-axis driving mechanism with clearance is presented. The objective of the optimization is to minimize the maximum absolute vibration peak of antenna acceleration by reducing the impact forces in clearance joint. The main consideration here is to optimize the contact parameters of the joint elements. The contact stiffness coefficient, damping coefficient and the dynamic friction coefficient for clearance joint elements are taken as the optimization variables. A Generalized Reduced Gradient (GRG) algorithm is used to solve this highly nonlinear optimization problem for dual-axis driving mechanism with clearance joints. The results show that the acceleration peaks of satellite antenna and contact forces in clearance joints are reduced obviously after design optimization, which contributes to a better performance of the satellite antenna. Also, the application and limitation of the proposed optimization method are discussed.
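
    As a rough illustration of the kind of contact model described above, the sketch below implements a Lankarani-Nikravesh-style nonlinear spring-damper normal force and a velocity-smoothed (modified Coulomb) friction force in Python. The function names and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def contact_force(delta, delta_dot, k=1e8, n=1.5, c_e=0.9, delta_dot0=1.0):
    """Nonlinear spring-damper normal contact force (Lankarani-Nikravesh form).
    delta: penetration depth [m], delta_dot: penetration velocity [m/s],
    k, n: generalized stiffness and exponent, c_e: restitution coefficient,
    delta_dot0: initial impact velocity [m/s]."""
    if delta <= 0.0:
        return 0.0          # no contact, no force
    hysteresis = 3.0 * (1.0 - c_e**2) / (4.0 * delta_dot0)
    return k * delta**n * (1.0 + hysteresis * delta_dot)

def friction_force(f_n, v_t, mu_d=0.1, v0=1e-4, v1=1e-2):
    """Modified Coulomb friction: a dynamic correction factor ramps the force
    up between tolerance velocities v0 and v1 to avoid the discontinuity at
    zero tangential velocity v_t."""
    v = abs(v_t)
    if v <= v0:
        c_d = 0.0
    elif v >= v1:
        c_d = 1.0
    else:
        c_d = (v - v0) / (v1 - v0)
    return -mu_d * c_d * f_n * np.sign(v_t)
```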

  14. Selecting Question-Specific Genes to Reduce Incongruence in Phylogenomics: A Case Study of Jawed Vertebrate Backbone Phylogeny.

    PubMed

    Chen, Meng-Yun; Liang, Dan; Zhang, Peng

    2015-11-01

    Incongruence between different phylogenomic analyses is the main challenge faced by phylogeneticists in the genomic era. To reduce incongruence, phylogenomic studies normally adopt some data filtering approaches, such as reducing missing data or using slowly evolving genes, to improve the signal quality of data. Here, we assembled a phylogenomic data set of 58 jawed vertebrate taxa and 4682 genes to investigate the backbone phylogeny of jawed vertebrates under both concatenation and coalescent-based frameworks. To evaluate the efficiency of extracting phylogenetic signals among different data filtering methods, we chose six highly intractable internodes within the backbone phylogeny of jawed vertebrates as our test questions. We found that our phylogenomic data set exhibits substantial conflicting signal among genes for these questions. Our analyses showed that non-specific data sets that are generated without bias toward specific questions are not sufficient to produce consistent results when there are several difficult nodes within a phylogeny. Moreover, phylogenetic accuracy based on non-specific data is considerably influenced by the size of data and the choice of tree inference methods. To address such incongruences, we selected genes that resolve a given internode but not the entire phylogeny. Notably, not only can this strategy yield correct relationships for the question, but it also reduces inconsistency associated with data sizes and inference methods. Our study highlights the importance of gene selection in phylogenomic analyses, suggesting that simply using a large amount of data cannot guarantee correct results. Constructing question-specific data sets may be more powerful for resolving problematic nodes. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Alumina Refinery Wastewater Management: When Zero Discharge Just Isn't Feasible….

    NASA Astrophysics Data System (ADS)

    Martin, Lucy; Howard, Steven

    Management and treatment of liquid effluents are determinant considerations in the design of alumina refineries. Rainfall, evaporation rate, proximity to the coast, process design and layout, ore mineralogy, the local environment, and potential impact on contiguous communities are all integral to the development of an appropriate refinery water management strategy. The goal is to achieve zero discharge of liquid effluent to the environment. However this is not always the most feasible solution under the extreme rainfall conditions in tropical and subtropical locations. This paper will explore the following issues for both inland and coastal refineries: • Methods to reduce and control refinery discharges

  16. [Cranial nerve damage after neuroaxial methods of anesthesia in puerperas].

    PubMed

    Floka, S E; Shifman, E M

    2007-01-01

    The paper describes cranial nerve damage, a rare complication of neuraxial anesthesia in obstetric care. The literature contains summarized data on 17 cases of neurological deficit developing after subarachnoid or epidural anesthesia in puerperas. The suggested etiological and pathogenetic factors of these complications are high spread of the local anesthetic, arterial hypotension due to neuraxial anesthetics, leakage of cerebrospinal fluid after dural puncture (including unintended puncture during epidural anesthesia), and ischemic injury after the epidural blood patch performed to relieve postpuncture headache. Closer attention to these risk factors may reduce the incidence of cranial nerve damage in puerperas.

  17. Terahertz Heterodyne Receiver with an Electron-Heating Mixer and a Heterodyne Based on the Quantum-Cascade Laser

    NASA Astrophysics Data System (ADS)

    Seliverstov, S. V.; Anfertyev, V. A.; Tretyakov, I. V.; Ozheredov, I. A.; Solyankin, P. M.; Revin, L. S.; Vaks, V. L.; Rusova, A. A.; Goltsman, G. N.; Shkurinov, A. P.

    2017-12-01

    We study the characteristics of a laboratory prototype of a terahertz heterodyne receiver with an electron-heating mixer and a heterodyne (local oscillator) based on a quantum-cascade laser. The results demonstrate that this receiver can serve as the basis for a high-sensitivity terahertz spectrometer suited to many basic and practical applications. A significant advantage of this receiver is the possibility of placing the mixer and the heterodyne in the same cryostat, which reduces the device dimensions considerably. The experimental results are analyzed, and methods for optimizing the receiver parameters are proposed.

  18. The Pharmacogenomics of Anti-Hypertensive Therapy.

    PubMed

    Padmanabhan, Sandosh; Paul, Laura; Dominczak, Anna F

    2010-06-01

    Hypertension is a major public health problem, but measures to reduce blood pressure and thus cardiovascular risk are complicated by the high prevalence of treatment resistance, despite the availability of multiple drugs. Drug side-effects contribute considerably to suboptimal blood pressure control. Clinicians must often rely on empirical methods to match patients with effective drug treatment. Hypertension pharmacogenomics seeks to find genetic predictors of response to drugs that lower blood pressure and to translate this knowledge into clinical practice. In this review we summarise the current status of hypertension pharmacogenetics from monogenic hypertension to essential hypertension and discuss the issues that need to be considered in a hypertension pharmacogenomic study.

  19. Low-mass materials and vertex detector systems

    DOE PAGES

    Cooper, William E.

    2014-01-01

    Physics requirements set the material budget and the precision and stability necessary in low-mass vertex detector systems. Operational considerations, along with physics requirements, set the operating environment to be provided and determine the heat to be removed. Representative materials for fulfilling those requirements are described and properties of the materials are tabulated. A figure of merit is proposed to aid in material selection. Multi-layer structures are examined as a method to allow material to be used effectively, thereby reducing material contributions. Lastly, comments are made on future directions to be considered in using present materials effectively and in developing new materials.

  20. Statistical studies in stellar rotation 2: A method of analyzing rotational coupling in double stars and an introduction to its applications

    NASA Technical Reports Server (NTRS)

    Bernacca, P. L.

    1971-01-01

    The correlation between the equatorial velocities of the components of double stars is studied from a statistical standpoint. A theory of rotational correlation is developed and discussed with regard to its applicability to existing observations. The theory is then applied to a sample of visual binaries which are the least studied for rotational coupling. Consideration of eclipsing systems and spectroscopic binaries is limited to show how the degrees of freedom in the spin parallelism problem can be reduced. The analysis lends support to the existence of synchronism in closely spaced binaries.

  1. Flow visualization techniques in the Airborne Laser Laboratory program

    NASA Technical Reports Server (NTRS)

    Walterick, R. E.; Vankuren, J. T.

    1980-01-01

    A turret/fairing assembly for laser applications was designed and tested. Wind tunnel testing was conducted using flow visualization techniques. The techniques used have included the methods of tufting, encapsulated liquid crystals, oil flow, sublimation and schlieren and shadowgraph photography. The results were directly applied to the design of fairing shapes for minimum drag and reduced turret buffet. In addition, the results are of primary importance to the study of light propagation paths in the near flow field of the turret cavity. Results indicate that the flow in the vicinity of the turret is an important factor for consideration in the design of suitable turret/fairing or aero-optic assemblies.

  2. A Protocol for Diagnosis and Management of Aortic Atherosclerosis in Cardiac Surgery Patients

    PubMed Central

    Brandon Bravo Bruinsma, George J.; Van 't Hof, Arnoud W. J.; Grandjean, Jan G.; Nierich, Arno P.

    2017-01-01

    In patients undergoing cardiac surgery, use of perioperative screening for aortic atherosclerosis with modified TEE (A-View method) was associated with lower postoperative mortality, but not stroke, as compared to patients operated on without such screening. At the time of clinical implementation and validation, we did not yet standardize the indications for modified TEE and the changes in patient management in the presence of aortic atherosclerosis. Therefore, we designed a protocol, which combined the diagnosis of atherosclerosis of thoracic aorta and the subsequent considerations with respect to the intraoperative management and provides a systematic approach to reduce the risk of cerebral complications. PMID:28852575

  3. Anesthetic Considerations in Robotic-Assisted Gynecologic Surgery

    PubMed Central

    Kaye, Alan D.; Vadivelu, Nalini; Ahuja, Nitin; Mitra, Sukanya; Silasi, Dan; Urman, Richard D.

    2013-01-01

    Background Robotic-assisted surgery has evolved over the past 2 decades with constantly improving technology that assists surgeons in multiple subspecialty disciplines. The surgical requirements of lithotomy and steep Trendelenburg positions, along with the creation of a pneumoperitoneum and lack of direct access to the patient all present management challenges in gynecologic surgery. Patient positioning requirements can have significant physiologic effects and can result in many complications. Methods This review focuses on the anesthetic and surgical implications of robot-assisted technology in gynecologic surgery. Conclusion Good communication among team members and knowledge of the nuances of robotic surgery have the potential to improve patient outcomes, increase efficiency, and reduce complications. PMID:24358000

  4. A Meta-Surface Antenna Array Decoupling (MAAD) Method for Mutual Coupling Reduction in a MIMO Antenna System.

    PubMed

    Wang, Ziyang; Zhao, Luyu; Cai, Yuanming; Zheng, Shufeng; Yin, Yingzeng

    2018-02-16

    In this paper, a method to reduce the inevitable mutual coupling between antennas in an extremely closely spaced two-element MIMO antenna array is proposed. A suspended meta-surface composed of periodic square split ring resonators (SRRs) is placed above the antenna array for decoupling. The meta-surface is equivalent to a negative-permeability medium, along which wave propagation is rejected. By properly designing the rejection frequency band of the SRR unit, the mutual coupling between the antenna elements in the MIMO antenna system can be significantly reduced. Two prototypes of microstrip antenna arrays in the 5.8 GHz band, with and without the meta-surface, have been fabricated and measured. The matching bandwidths with reflection coefficient below -15 dB for the arrays without and with the meta-surface are 360 MHz and 900 MHz, respectively. Using the meta-surface, the isolation between elements is increased from around 8 dB to more than 27 dB within the band of interest. Meanwhile, the total efficiency and peak gain of each element, as well as the envelope correlation coefficient (ECC) between the two elements, are also improved by considerable amounts. All the results demonstrate that the proposed method is very effective for enhancing the performance of MIMO antenna arrays.
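
    For readers who want to reproduce the ECC figure of merit mentioned above, the envelope correlation coefficient of a two-element array is commonly estimated from the measured S-parameters under the assumption of lossless antennas. The sketch below shows that textbook formula; it is not code from the paper, and the example values are arbitrary.

```python
import numpy as np

def ecc_from_s_params(s11, s12, s21, s22):
    """Envelope correlation coefficient of a two-port antenna array estimated
    from complex S-parameters (valid only for lossless, well-matched antennas)."""
    num = abs(np.conj(s11) * s12 + np.conj(s21) * s22) ** 2
    den = ((1 - abs(s11) ** 2 - abs(s21) ** 2) *
           (1 - abs(s22) ** 2 - abs(s12) ** 2))
    return num / den

# Arbitrary example: well-isolated elements give a very small ECC
print(ecc_from_s_params(0.1, 0.04, 0.04, 0.1))
```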

  5. Rare Earth Extraction from NdFeB Magnet Using a Closed-Loop Acid Process.

    PubMed

    Kitagawa, Jiro; Uemura, Ryohei

    2017-08-14

    There is considerable interest in extraction of rare earth elements from NdFeB magnets to enable recycling of these elements. In practical extraction methods using wet processes, the acid waste solution discharge is a problem that must be resolved to reduce the environmental impact of the process. Here, we present an encouraging demonstration of rare earth element extraction from a NdFeB magnet using a closed-loop hydrochloric acid (HCl)-based process. The extraction method is based on corrosion of the magnet in a pretreatment stage and a subsequent ionic liquid technique for Fe extraction from the HCl solution. The rare earth elements are then precipitated using oxalic acid. Triple extraction has been conducted and the recovery ratio of the rare earth elements from the solution is approximately 50% for each extraction process, as compared to almost 100% recovery when using a one-shot extraction process without the ionic liquid but with sufficient oxalic acid. Despite its reduced extraction efficiency, the proposed method with its small number of procedures at almost room temperature is still highly advantageous in terms of both cost and environmental friendliness. This study represents an initial step towards realization of a closed-loop acid process for recycling of rare earth elements.

  6. Exact consideration of data redundancies for spiral cone-beam CT

    NASA Astrophysics Data System (ADS)

    Lauritsch, Guenter; Katsevich, Alexander; Hirsch, Michael

    2004-05-01

    In multi-slice spiral computed tomography (CT) there is an obvious trend toward adding more and more detector rows. The goals are numerous: volume coverage, isotropic spatial resolution, and speed. Consequently, there will be a variety of scan protocols optimized for clinical applications. Flexibility in table feed requires consideration of data redundancies to ensure efficient detector usage. Until recently this was achieved by approximate reconstruction algorithms only. However, due to the increasing cone angles there is a need for exact treatment of the cone-beam geometry. A new, exact and efficient 3-PI algorithm for considering three-fold data redundancies was derived from a general theoretical framework based on 3D Radon inversion using Grangeat's formula. The 3-PI algorithm possesses the same simple and efficient structure as the previously proposed 1-PI method for non-redundant data. Filtering is one-dimensional, performed along lines with variable tilt on the detector. This talk deals with a thorough evaluation of the performance of the 3-PI algorithm in comparison to the 1-PI method. Image quality of the 3-PI algorithm is superior. The prominent spiral artifacts and other discretization artifacts are significantly reduced due to averaging effects when redundant data are taken into account, and the signal-to-noise ratio is increased. The computational expense is comparable even to that of approximate algorithms. The 3-PI algorithm proves its practicability for applications in medical imaging. Other exact n-PI methods for n-fold data redundancies (n odd) can be deduced from the general theoretical framework.

  7. Optimization of the pepsin digestion method for anisakids inspection in the fishing industry.

    PubMed

    Llarena-Reino, María; Piñeiro, Carmen; Antonio, José; Outeriño, Luis; Vello, Carlos; González, Ángel F; Pascual, Santiago

    2013-01-31

    During the last 50 years human anisakiasis has been rising while parasites have increased their prevalence at determined fisheries becoming an emergent major public health problem. Although artificial enzymatic digestion procedure by CODEX (STAN 244-2004: standard for salted Atlantic herring and salted sprat) is the recommended protocol for anisakids inspection, no international agreement has been achieved in veterinary and scientific digestion protocols to regulate this growing source of biological hazard in fish products. The aim of this work was to optimize the current artificial digestion protocol by CODEX with the purpose of offering a faster, more useful and safer procedure for factories workers, than the current one for anisakids detection. To achieve these objectives, the existing pepsin chemicals and the conditions of the digestion method were evaluated and assayed in fresh and frozen samples, both in lean and fatty fish species. Results showed that the new digestion procedure considerably reduces the assay time, and it is more handy and efficient (the quantity of the resulting residue was considerably lower after less time) than the widely used CODEX procedure. In conclusion, the new digestion method herein proposed based on liquid pepsin format is an accurate reproducible and user-friendly off-site tool, that can be useful in the implementation of screening programs for the prevention of human anisakiasis (and associated gastroallergic disorders) due to the consumption of raw or undercooked contaminated seafood products. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. The role of urban municipal governments in reducing health inequities: A meta-narrative mapping analysis

    PubMed Central

    2010-01-01

    Background The 1986 Ottawa Charter for Health Promotion coincided with a preponderance of research, worldwide, on the social determinants of health and health inequities. Despite the establishment of a 'health inequities knowledge base', the precise roles for municipal governments in reducing health inequities at the local level remain poorly defined. The objective of this study was to monitor thematic trends in this knowledge base over time, and to track scholarly prescriptions for municipal government intervention on local health inequities. Methods Using meta-narrative mapping, four bodies of scholarly literature - 'health promotion', 'Healthy Cities', 'population health' and 'urban health' - that have made substantial contributions to the health inequities knowledge base were analyzed over the 1986-2006 timeframe. Article abstracts were retrieved from the four literature bodies using three electronic databases (PubMed, Sociological Abstracts, Web of Science), and coded for bibliographic characteristics, article themes and determinants of health profiles, and prescriptions for municipal government interventions on health inequities. Results 1004 journal abstracts pertaining to health inequities were analyzed. The overall quantity of abstracts increased considerably over the 20 year timeframe, and emerged primarily from the 'health promotion' and 'population health' literatures. 'Healthy lifestyles' and 'healthcare' were the most commonly emphasized themes in the abstracts. Only 17% of the abstracts articulated prescriptions for municipal government interventions on local health inequities. Such interventions included public health campaigns, partnering with other governments and non-governmental organizations for health interventions, and delivering effectively on existing responsibilities to improve health outcomes and reduce inequities. Abstracts originating from Europe, and from the 'Healthy Cities' and 'urban health' literatures, were most vocal regarding potential avenues for municipal government involvement on health inequities. Conclusions This study has demonstrated a pervasiveness of 'behavioural' and 'biomedical' perspectives, and a lack of consideration afforded to the roles and responsibilities of municipal governments, among the health inequities scholarly community. Thus, despite considerable research activity over the past two decades, the 'health inequities knowledge base' inadequately reflects the complex aetiology of, and solutions to, population health inequities. PMID:20500850

  9. Explanation of Change (EoC) Study: Considerations and Implementation Challenges

    NASA Technical Reports Server (NTRS)

    Bitten, Robert E.; Emmons, Debra L.; Hart, Matthew J.; Bordi, Francesco; Scolese, Christopher; Hinners, Noel

    2013-01-01

    This paper discusses the implementation of considerations resulting from a study investigating the cost change experienced by historical NASA science missions. The study investigated historical milestone and monthly status report documentation followed by interviews with key project personnel. The reasons for cost change were binned as being external to NASA, external to the project and internal to the project relative to the project's planning and execution. Based on the results of the binning process and the synthesis of project meetings and interviews, ten considerations were made with the objective to decrease the potential for cost change in future missions. Although no one magic bullet consideration was discovered, the considerations taken as a whole should help reduce cost and schedule change in future NASA missions.

  10. A STUDY OF THE EFFECT OF SUBSTITUENTS AND OF SOLVENT ON THE REACTIVITY OF THE NORMAL AND ABNORMAL POSITIONS OF UNSYMMETRICAL ORGANIC EPOXIDES

    DTIC Science & Technology

    determined by a kinetic study of the reactions of m-chloro- and 3,4-dimethylbenzylamine with styrene oxide in ethanol at 3 temperatures. The results...and o-methyl-styrene oxide with benzylamine in ethanol showed that the beta-methyl group reduces the rate of attack at both positions very...considerably, while the alpha-methyl group reduces the rate of normal attack slightly and that of abnormal attack considerably, and the o-methyl group has surprisingly little effect on the rate of attack at either position. (Author)

  11. Interphase layer optimization for metal matrix composites with fabrication considerations

    NASA Technical Reports Server (NTRS)

    Morel, M.; Saravanos, D. A.; Chamis, C. C.

    1991-01-01

    A methodology is presented to reduce the final matrix microstresses for metal matrix composites by concurrently optimizing the interphase characteristics and fabrication process. Application cases include interphase tailoring with and without fabrication considerations for two material systems, graphite/copper and silicon carbide/titanium. Results indicate that concurrent interphase/fabrication optimization produces significant reductions in the matrix residual stresses and strong coupling between interphase and fabrication tailoring. The interphase coefficient of thermal expansion and the fabrication consolidation pressure are the most important design parameters and must be concurrently optimized to further reduce the microstresses to more desirable magnitudes.

  12. Impacts of correcting the inter-variable correlation of climate model outputs on hydrological modeling

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Li, Chao; Brissette, François P.; Chen, Hua; Wang, Mingna; Essou, Gilles R. C.

    2018-05-01

    Bias correction is usually implemented prior to using climate model outputs for impact studies. However, bias correction methods that are commonly used treat climate variables independently and often ignore inter-variable dependencies. The effects of ignoring such dependencies on impact studies need to be investigated. This study aims to assess the impacts of correcting the inter-variable correlation of climate model outputs on hydrological modeling. To this end, a joint bias correction (JBC) method which corrects the joint distribution of two variables as a whole is compared with an independent bias correction (IBC) method; this is considered in terms of correcting simulations of precipitation and temperature from 26 climate models for hydrological modeling over 12 watersheds located in various climate regimes. The results show that the simulated precipitation and temperature are considerably biased not only in the individual distributions, but also in their correlations, which in turn result in biased hydrological simulations. In addition to reducing the biases of the individual characteristics of precipitation and temperature, the JBC method can also reduce the bias in precipitation-temperature (P-T) correlations. In terms of hydrological modeling, the JBC method performs significantly better than the IBC method for 11 out of the 12 watersheds over the calibration period. For the validation period, the advantages of the JBC method are greatly reduced as the performance becomes dependent on the watershed, GCM and hydrological metric considered. For arid/tropical and snowfall-rainfall-mixed watersheds, JBC performs better than IBC. For snowfall- or rainfall-dominated watersheds, however, the two methods behave similarly, with IBC performing somewhat better than JBC. Overall, the results emphasize the advantages of correcting the P-T correlation when using climate model-simulated precipitation and temperature to assess the impact of climate change on watershed hydrology. However, a thorough validation and a comparison with other methods are recommended before using the JBC method, since it may perform worse than the IBC method for some cases due to bias nonstationarity of climate model outputs.
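
    The paper's IBC and JBC procedures are not spelled out in the abstract, but the idea of an "independent" correction can be illustrated with a standard empirical quantile-mapping step applied to one variable at a time, as sketched below. The function and its quantile resolution are assumptions made for illustration; a joint method would additionally adjust the rank dependence between precipitation and temperature.

```python
import numpy as np

def quantile_map(model_hist, obs, model_fut, n_q=99):
    """Empirical quantile mapping for a single climate variable: map each
    future model value through its quantile in the historical simulation
    onto the corresponding quantile of the observations."""
    q = np.linspace(0.01, 0.99, n_q)
    mh = np.quantile(model_hist, q)
    ob = np.quantile(obs, q)
    ranks = np.interp(model_fut, mh, q)   # quantile of each future value
    return np.interp(ranks, q, ob)        # observed value at that quantile
```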

  13. Habitat Design Considerations for Implementing Solar Particle Event Radiation Protection

    NASA Technical Reports Server (NTRS)

    Simon, Mathew A.; Clowdsley, Martha S.; Walker, Steven A.

    2013-01-01

    Radiation protection is an important habitat design consideration for human exploration missions beyond Low Earth Orbit. Fortunately, radiation shelter concepts can effectively reduce astronaut exposure for the relatively low proton energies of solar particle events, enabling moderate duration missions of several months before astronaut exposure (galactic cosmic ray and solar particle event) approaches radiation exposure limits. In order to minimize habitat mass for increasingly challenging missions, design of radiation shelters must minimize dedicated, single-purpose shielding mass by leveraging the design and placement of habitat subsystems, accommodations, and consumables. NASA's Advanced Exploration Systems RadWorks Storm Shelter Team has recently designed and performed radiation analysis on several low dedicated mass shelter concepts for a year-long mission. This paper describes habitat design considerations identified during the study's radiation analysis. These considerations include placement of the shelter within a habitat for improved protection, integration of human factors guidance for sizing shelters, identification of potential opportunities for habitat subsystems to compromise on individual subsystem performances for overall vehicle mass reductions, and pre-configuration of shelter components for reduced deployment times.

  14. Unsupervised change detection of multispectral images based on spatial constraint chi-squared transform and Markov random field model

    NASA Astrophysics Data System (ADS)

    Shi, Aiye; Wang, Chao; Shen, Shaohong; Huang, Fengchen; Ma, Zhenli

    2016-10-01

    Chi-squared transform (CST), as a statistical method, can describe the difference degree between vectors. The CST-based methods operate directly on information stored in the difference image and are simple and effective methods for detecting changes in remotely sensed images that have been registered and aligned. However, the technique does not take spatial information into consideration, which leads to much noise in the result of change detection. An improved unsupervised change detection method is proposed based on spatial constraint CST (SCCST) in combination with a Markov random field (MRF) model. First, the mean and variance matrix of the difference image of bitemporal images are estimated by an iterative trimming method. In each iteration, spatial information is injected to reduce scattered changed points (also known as "salt and pepper" noise). To determine the key parameter confidence level in the SCCST method, a pseudotraining dataset is constructed to estimate the optimal value. Then, the result of SCCST, as an initial solution of change detection, is further improved by the MRF model. The experiments on simulated and real multitemporal and multispectral images indicate that the proposed method performs well in comprehensive indices compared with other methods.
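
    The chi-squared transform itself is a Mahalanobis-type distance computed per pixel of the multispectral difference image; a minimal sketch is given below. The spatial-constraint and MRF refinement steps of the proposed SCCST method are not shown, and the function name is an assumption.

```python
import numpy as np

def chi_squared_transform(diff_image):
    """Per-pixel chi-squared (Mahalanobis-type) distance of a multispectral
    difference image of shape (rows, cols, bands); large values mark pixels
    whose spectral change deviates strongly from the mean change."""
    rows, cols, bands = diff_image.shape
    x = diff_image.reshape(-1, bands)
    mu = x.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(x, rowvar=False))
    d = x - mu
    chi2 = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return chi2.reshape(rows, cols)
```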

  15. Joint image and motion reconstruction for PET using a B-spline motion model.

    PubMed

    Blume, Moritz; Navab, Nassir; Rafecas, Magdalena

    2012-12-21

    We present a novel joint image and motion reconstruction method for PET. The method is based on gated data and reconstructs an image together with a motion function. The motion function can be used to transform the reconstructed image to any of the input gates. All available events (from all gates) are used in the reconstruction. The presented method uses a B-spline motion model, together with a novel motion regularization procedure that does not need a regularization parameter (which is usually extremely difficult to adjust). Several image and motion grid levels are used in order to reduce the reconstruction time. In a simulation study, the presented method is compared to a recently proposed joint reconstruction method. While the presented method provides comparable reconstruction quality, it is much easier to use since no regularization parameter has to be chosen. Furthermore, since the B-spline discretization of the motion function depends on fewer parameters than a displacement field, the presented method is considerably faster and consumes less memory than its counterpart. The method is also applied to clinical data, for which a novel purely data-driven gating approach is presented.
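
    As a small aside on the motion model, a uniform cubic B-spline blends the four nearest control points with the weights sketched below; this is the generic building block of B-spline motion fields of the kind mentioned above, not the authors' implementation.

```python
import numpy as np

def cubic_bspline_weights(t):
    """Weights of the four nearest control points of a uniform cubic B-spline
    for a fractional offset t in [0, 1)."""
    return np.array([
        (1 - t) ** 3 / 6.0,
        (3 * t**3 - 6 * t**2 + 4) / 6.0,
        (-3 * t**3 + 3 * t**2 + 3 * t + 1) / 6.0,
        t**3 / 6.0,
    ])

# The weights always sum to 1 (partition of unity), so constant motion is reproduced exactly
print(cubic_bspline_weights(0.25).sum())
```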

  16. Comparing personal insight gains due to consideration of a recent dream and consideration of a recent event using the Ullman and Schredl dream group methods

    PubMed Central

    Edwards, Christopher L.; Malinowski, Josie E.; McGee, Shauna L.; Bennett, Paul D.; Ruby, Perrine M.; Blagrove, Mark T.

    2015-01-01

    There have been reports and claims in the psychotherapeutic literature that the consideration of recent dreams can result in personal realizations and insight. There is theoretical support for these claims from work on rapid eye movement (REM) sleep having a function in the consolidation of emotional memories and the creative formation of connections between new and older memories. To investigate these claims, 11 participants (10 females, one male) reported and considered a recent home dream in a dream discussion group that followed the “Appreciating dreams” method of Montague Ullman. The group ran 11 times, each participant attending and participating once. A further nine participants (seven females, two males) reported and considered a recent home dream in a group that followed the “Listening to the dreamer” method of Michael Schredl. The two studies each had a control condition where the participant also reported a recent event, the consideration of which followed the same technique as was followed for the dream report. Outcomes of the discussions were assessed by the participants on the Gains from Dream Interpretation (GDI) scale, and on its counterpart, the Gains from Event Interpretation scale. High ratings on the GDI experiential-insight subscale were reported for both methods when applied to dreams, and, for the Ullman method, Exploration-Insight ratings for the dream condition were significantly higher than for the control event condition. In the Ullman method, self-assessment of personal insight due to consideration of dream content was also significantly higher than for the event consideration condition. The findings support the view that benefits can be obtained from the consideration of dream content, in terms of identifying the waking life sources of dream content, and because personal insight may also occur. To investigate the mechanisms for the findings, the studies should be repeated with REM and non-REM dream reports, hypothesizing greater insight from the former. PMID:26150797

  17. [Soft tissue defects treated with perforator flaps].

    PubMed

    Weum, Sven; de Weerd, Louis; Klein, Steven; Hage, J Joris

    2008-01-31

    Treatment of soft tissue defects caused by trauma, tumour surgery or pressure sores is a challenge to the reconstructive surgeon. Although contour and function may be restored by tissue transposition, traditional methods often cause significant donor site morbidity. This article describes how increased understanding of vascular anatomy has led to the development of new techniques. The article is based on textbooks of plastic surgery, selected articles and own clinical experience. Pedicled and free perforator flaps represent the latest development in surgical treatment of soft tissue defects. The use of perforator flaps can considerably reduce the disadvantages that are associated with other surgical methods. The use of perforator flaps demands microsurgical skills, but has many advantages. Reliable vascular supply and a good aesthetical result can be combined with minimal donor site morbidity. In many cases this technique may even give sensibility to the reconstructed area.

  18. [Clinical application of iodine 123 with special consideration of radionuclide purity, measuring accuracy and radiation dose (author's transl)].

    PubMed

    Hermann, H J; Ammon, J; Winkel, K z; Haubold, U

    1975-05-01

    Iodine 123 is a nearly "ideal" radionuclide for thyroid imaging. The production of Iodine 123 requires cyclotrons or accelerators. The production of multicurie amounts of Iodine 123 has been suggested through the use of high-energy accelerators (less than 60 MeV). Most of the methods for producing Iodine 123 with a compact cyclotron result in contamination with, e.g., Iodine 124, which reduces the spatial resolution of imaging procedures and increases the radiation dose to the patient. The radiation dose has been calculated for three methods of production. The various contaminations with Iodine 124, Iodine 125, and Iodine 126 result in radiation doses comparable to that of Iodine 131, provided that the time between production and application is more than four half-lives of Iodine 123.

  19. Patterns and Sequences: Interactive Exploration of Clickstreams to Understand Common Visitor Paths.

    PubMed

    Liu, Zhicheng; Wang, Yang; Dontcheva, Mira; Hoffman, Matthew; Walker, Seth; Wilson, Alan

    2017-01-01

    Modern web clickstream data consists of long, high-dimensional sequences of multivariate events, making it difficult to analyze. Following the overarching principle that the visual interface should provide information about the dataset at multiple levels of granularity and allow users to easily navigate across these levels, we identify four levels of granularity in clickstream analysis: patterns, segments, sequences and events. We present an analytic pipeline consisting of three stages: pattern mining, pattern pruning and coordinated exploration between patterns and sequences. Based on this approach, we discuss properties of maximal sequential patterns, propose methods to reduce the number of patterns and describe design considerations for visualizing the extracted sequential patterns and the corresponding raw sequences. We demonstrate the viability of our approach through an analysis scenario and discuss the strengths and limitations of the methods based on user feedback.

  20. Decision making for best cogeneration power integration into a grid

    NASA Astrophysics Data System (ADS)

    Al Asmar, Joseph; Zakhia, Nadim; Kouta, Raed; Wack, Maxime

    2016-07-01

    Cogeneration systems are known to be efficient power systems because of their ability to reduce pollution. Their integration into a grid requires simultaneous consideration of economic and environmental challenges, so an optimal cogeneration power must be adopted to face them. This work presents a novel way of selecting a suitable solution using a heuristic optimization method; its aim is to optimize the cogeneration capacity to be installed according to economic and environmental concerns. The novelty is based on a sensitivity and data analysis method, namely Multiple Linear Regression (MLR), which establishes a compromise between power, economy, and pollution and thereby identifies a suitable cogeneration power to be integrated into the grid. The data exploited were the results of a Genetic Algorithm (GA) multi-objective optimization. Moreover, the impact of the utility's subsidy on the selected power is shown.
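
    Multiple Linear Regression itself reduces to an ordinary least-squares fit; the sketch below shows one way to compute it with NumPy. The mapping of predictors to GA outputs (candidate powers, cost and emission scores) is an assumption made only to make the example concrete.

```python
import numpy as np

def fit_mlr(X, y):
    """Ordinary least-squares fit of y = b0 + b1*x1 + ... + bk*xk.
    X is an (n_samples, n_features) array of predictors (e.g. candidate
    cogeneration powers and related quantities from the GA runs) and y the
    response (e.g. a combined cost/emission score)."""
    A = np.column_stack([np.ones(len(X)), X])      # prepend intercept column
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs                                  # [b0, b1, ..., bk]
```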

  1. Statistical processing of large image sequences.

    PubMed

    Khellah, F; Fieguth, P; Murray, M J; Allen, M

    2005-01-01

    The dynamic estimation of large-scale stochastic image sequences, as frequently encountered in remote sensing, is important in a variety of scientific applications. However, the size of such images makes conventional dynamic estimation methods, for example, the Kalman and related filters, impractical. In this paper, we present an approach that emulates the Kalman filter, but with considerably reduced computational and storage requirements. Our approach is illustrated in the context of a 512 x 512 image sequence of ocean surface temperature. The static estimation step, the primary contribution here, uses a mixture of stationary models to accurately mimic the effect of a nonstationary prior, simplifying both computational complexity and modeling. Our approach provides an efficient, stable, positive-definite model which is consistent with the given correlation structure. Thus, the methods of this paper may find application in modeling and single-frame estimation.
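
    For context on why an approximation is needed, the standard Kalman measurement update is shown below; for a 512 x 512 field the state has roughly 2.6 x 10^5 elements, so forming and inverting these covariance matrices directly is impractical, which is what the reduced-storage scheme described above avoids. This is the textbook update, not the authors' emulation.

```python
import numpy as np

def kalman_update(x_prior, P_prior, z, H, R):
    """One standard Kalman measurement update (state estimate and covariance).
    For very large states, P_prior alone would need O(n^2) storage."""
    S = H @ P_prior @ H.T + R                     # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_post = x_prior + K @ (z - H @ x_prior)      # updated state
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
    return x_post, P_post
```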

  2. A hierarchy for modeling high speed propulsion systems

    NASA Technical Reports Server (NTRS)

    Hartley, Tom T.; Deabreu, Alex

    1991-01-01

    General research efforts on reduced order propulsion models for control systems design are overviewed. Methods for modeling high speed propulsion systems are discussed including internal flow propulsion systems that do not contain rotating machinery such as inlets, ramjets, and scramjets. The discussion is separated into four sections: (1) computational fluid dynamics model for the entire nonlinear system or high order nonlinear models; (2) high order linearized model derived from fundamental physics; (3) low order linear models obtained from other high order models; and (4) low order nonlinear models. Included are special considerations on any relevant control system designs. The methods discussed are for the quasi-one dimensional Euler equations of gasdynamic flow. The essential nonlinear features represented are large amplitude nonlinear waves, moving normal shocks, hammershocks, subsonic combustion via heat addition, temperature dependent gases, detonation, and thermal choking.

  3. Bi-Objective Flexible Job-Shop Scheduling Problem Considering Energy Consumption under Stochastic Processing Times.

    PubMed

    Yang, Xin; Zeng, Zhenxiang; Wang, Ruidong; Sun, Xueshan

    2016-01-01

    This paper presents a novel method for the optimization of the bi-objective Flexible Job-shop Scheduling Problem (FJSP) under stochastic processing times. The robust counterpart model and the Non-dominated Sorting Genetic Algorithm II (NSGA-II) are used to solve the bi-objective FJSP, taking into account both the completion time and the total energy consumption under stochastic processing times. A case study on GM Corporation verifies that the NSGA-II used in this paper is effective and offers advantages over HPSO and PSO+SA in solving the proposed model. The idea and method of the paper can be generalized widely in the manufacturing industry, because the approach can reduce the energy consumption of energy-intensive manufacturing enterprises with little additional investment when applied to existing systems.

  4. Bi-Objective Flexible Job-Shop Scheduling Problem Considering Energy Consumption under Stochastic Processing Times

    PubMed Central

    Zeng, Zhenxiang; Wang, Ruidong; Sun, Xueshan

    2016-01-01

    This paper presents a novel method for the optimization of the bi-objective Flexible Job-shop Scheduling Problem (FJSP) under stochastic processing times. The robust counterpart model and the Non-dominated Sorting Genetic Algorithm II (NSGA-II) are used to solve the bi-objective FJSP, taking into account both the completion time and the total energy consumption under stochastic processing times. A case study on GM Corporation verifies that the NSGA-II used in this paper is effective and offers advantages over HPSO and PSO+SA in solving the proposed model. The idea and method of the paper can be generalized widely in the manufacturing industry, because the approach can reduce the energy consumption of energy-intensive manufacturing enterprises with little additional investment when applied to existing systems. PMID:27907163
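
    The core of NSGA-II is the repeated identification of non-dominated fronts over the two objectives (here completion time and total energy). The sketch below shows only that dominance test and the extraction of the first front; the sample objective values are invented for illustration, and the full NSGA-II machinery (crowding distance, crossover, mutation) is omitted.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(objectives):
    """Indices of the non-dominated (Pareto-optimal) solutions."""
    return [i for i, a in enumerate(objectives)
            if not any(dominates(b, a) for j, b in enumerate(objectives) if j != i)]

# Invented (makespan, energy) pairs for four candidate schedules
print(first_front([(10, 5), (8, 7), (11, 6), (12, 4)]))   # -> [0, 1, 3]
```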

  5. Optimized hyperspectral band selection using hybrid genetic algorithm and gravitational search algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Aizhu; Sun, Genyun; Wang, Zhenjie

    2015-12-01

    The severe information redundancy in hyperspectral images (HIs) does not contribute to data analysis accuracy; instead, it requires expensive computational resources. Consequently, to identify the most useful and valuable information in HIs and thereby improve the accuracy of data analysis, this paper proposes a novel hyperspectral band selection method using a hybrid genetic algorithm and gravitational search algorithm (GA-GSA). In the proposed method, the GA-GSA is first mapped to the binary space. Then, the accuracy of a support vector machine (SVM) classifier and the number of selected spectral bands are used to measure the discriminative capability of a band subset. Finally, the band subset that contains the smallest number of spectral bands while covering the most useful and valuable information is obtained. To verify the effectiveness of the proposed method, studies conducted on an AVIRIS image against two recently proposed state-of-the-art GSA variants are presented. The experimental results reveal the superiority of the proposed method and indicate that it can indeed considerably reduce data storage costs and efficiently identify a band subset with stable and high classification precision.
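
    A typical wrapper-style fitness for a binary band-selection vector rewards classifier accuracy while penalizing the number of selected bands, which is the trade-off described above. The sketch below uses scikit-learn's SVM with cross-validation; the weighting (alpha) and the exact form of the penalty are assumptions, not the paper's definition.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def band_subset_fitness(band_mask, X, y, alpha=0.95):
    """Fitness of a binary band-selection vector: weighted sum of SVM
    cross-validated accuracy and the fraction of bands left out."""
    selected = np.flatnonzero(band_mask)
    if selected.size == 0:
        return 0.0                      # an empty subset is useless
    acc = cross_val_score(SVC(kernel='rbf'), X[:, selected], y, cv=3).mean()
    return alpha * acc + (1 - alpha) * (1 - selected.size / X.shape[1])
```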

  6. Extraction and determination of arsenic species in leafy vegetables: Method development and application.

    PubMed

    Ma, Li; Yang, Zhaoguang; Kong, Qian; Wang, Lin

    2017-02-15

    Extraction of arsenic (As) species from leafy vegetables was investigated using different combinations of methods and extractants. The extracted As species were separated and determined by an HPLC-ICP-MS method. The microwave-assisted method using 1% HNO3 as the extractant exhibited satisfactory efficiency (>90%) at 90°C for 1.5 h. The proposed method was applied to extract As species from real leafy vegetables. Thirteen cultivars of leafy vegetables were collected and analyzed. The predominant species in all the investigated vegetable samples were As(III) and As(V). Moreover, both As(III) and As(V) concentrations were positively and significantly (p<0.01) correlated with the total As (tAs) concentration. However, the percentage of As(V) decreased as the tAs concentration increased, probably due to the conversion and transformation of As(V) to As(III) after uptake. The hazard quotient results indicated no particular risk to 94.6% of local consumers. A considerable carcinogenic risk from consumption of the leafy vegetables was observed. Copyright © 2016 Elsevier Ltd. All rights reserved.
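
    The hazard quotient referred to above is normally the ratio of the estimated daily intake to a reference dose. A minimal sketch follows; the US EPA reference dose for inorganic arsenic (3e-4 mg/kg/day) and the example consumption figures are generic assumptions, not values from this study.

```python
def hazard_quotient(conc_mg_per_kg, intake_kg_per_day, body_weight_kg,
                    rfd_mg_per_kg_day=3e-4):
    """Non-carcinogenic hazard quotient: estimated daily intake / reference dose.
    HQ < 1 is conventionally read as 'no particular risk'."""
    edi = conc_mg_per_kg * intake_kg_per_day / body_weight_kg   # mg per kg body weight per day
    return edi / rfd_mg_per_kg_day

# Example: 0.05 mg/kg As in vegetables, 0.3 kg/day eaten by a 60 kg adult
print(hazard_quotient(0.05, 0.3, 60.0))   # ~0.83
```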

  7. Laser plasmapheresis for treatment of pulmonary and pleural suppurative diseases

    NASA Astrophysics Data System (ADS)

    Akopov, A. L.; Bely, K. P.; Berezin, Yu. D.; Orlov, S. V.

    1996-04-01

    Plasmapheresis in pulmonology is one of the leading methods of extracorporeal detoxification of patients with purulent lung and pleural diseases. However, this method causes a decrease in patients' leukocyte levels and humoral immunity indices. Medico-biological research conducted at the State Scientific Center of Pulmonology demonstrated that combining plasmapheresis with additional irradiation of the patients' reinfused erythrocyte mass by low-intensity He-Ne laser light at a wavelength of 0.63 µm considerably reduces the probability of these complications. This may be because laser irradiation of autologous erythrocytes induces local and general stimulation, thereby favoring involution of the inflammatory process. The suggested method of plasmapheresis was used in the treatment of 76 patients with lung abscess, empyema, purulent mediastinitis, and sepsis. The essence of the method consists in irradiating with a He-Ne laser the last portion of the erythrocyte mass (130 - 170 ml), diluted with saline, during its reinfusion in the course of routine plasmapheresis. The positive results of practical application allow the method to be characterized as highly effective for the treatment of purulent diseases in pulmonology.

  8. The opinions of occupational physicians about maintaining healthy workers by means of medical examinations in Japan using the Delphi method.

    PubMed

    Tateishi, Seiichiro; Watase, Mariko; Fujino, Yoshihisa; Mori, Koji

    2016-01-01

    In Japan, employee fitness for work is determined by annual medical examinations. It may be possible to reduce the variability in the results of fitness-for-work determinations if there is consensus among experts on when a single parameter should trigger consideration of work limitations. Consensus building was attempted among 104 occupational physicians using a 3-round Delphi method. For the medical examination parameters on which at least 50% of participants agreed in the 3rd round of the survey that the parameter would independently merit consideration of work limitation, the values proposed as criteria that trigger such consideration were sought. Parameters, along with their most frequently proposed criterion values, were defined in the study group meeting as parameters for which consensus was reached. Consensus was obtained for 8 parameters: systolic blood pressure 180 mmHg (86.6%), diastolic blood pressure 110 mmHg (85.9%), postprandial plasma glucose 300 mg/dl (76.9%), fasting plasma glucose 200 mg/dl (69.1%), Cre 2.0 mg/dl (67.2%), HbA1c (JDS) 10% (62.3%), ALT 200 U/l (61.6%), and Hb 8 g/l (58.5%). To support physicians who advise employers about work-related measures based on the results of general medical examinations of employees, expert consensus information was obtained that can serve as background material for making judgements. It is expected that the use of this information will facilitate taking appropriate measures after medical examination of employees.

  9. Validating the Operational Bias and Hypothesis of Universal Exponent in Landslide Frequency-Area Distribution

    PubMed Central

    Huang, Jr-Chuan; Lee, Tsung-Yu; Teng, Tse-Yang; Chen, Yi-Chin; Huang, Cho-Ying; Lee, Cheing-Tung

    2014-01-01

    The exponent decay in the landslide frequency-area distribution is widely used for assessing the consequences of landslides, with some studies arguing that the slope of the exponent decay is universal and independent of mechanisms and environmental settings. However, the documented exponent slopes are diverse, and data processing is hypothesized to be responsible for this inconsistency. An elaborate statistical experiment and two actual landslide inventories were used here to demonstrate the influence of data processing on the determination of the exponent. Seven categories with different landslide numbers were generated from a predefined inverse-gamma distribution and then analyzed by three data processing procedures (logarithmic binning, LB; normalized logarithmic binning, NLB; and the cumulative distribution function, CDF). Five different bin widths were also considered while applying LB and NLB. Following that, maximum likelihood estimation was used to estimate the exponent slopes. The results showed that the exponents estimated with CDF were unbiased, while LB and NLB performed poorly. The two binning-based methods led to considerable biases that increased with landslide number and bin width. The standard deviations of the estimated exponents depended not just on the landslide number but also on the binning method and bin width. Both extremely few and extremely plentiful landslide numbers reduced the confidence of the estimated exponents, attributable to limited landslide numbers and considerable operational bias, respectively. The diverse exponents documented in the literature should therefore be adjusted accordingly. Our study strongly suggests that the considerable bias due to data processing and data quality should be constrained in order to advance the understanding of landslide processes. PMID:24852019
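
    The contrast drawn above between binning-based fits and the CDF/maximum-likelihood route can be illustrated with the standard continuous MLE for a power-law tail exponent, which needs no bin width at all. The sketch below is that generic estimator; the study itself works with an inverse-gamma distribution, so this is an illustration of the principle rather than the authors' procedure.

```python
import numpy as np

def mle_powerlaw_exponent(areas, a_min):
    """Maximum-likelihood estimate of beta for a power-law tail
    p(A) ~ A**(-beta), using all landslide areas A >= a_min.
    Unlike logarithmic binning, the result does not depend on a bin width."""
    a = np.asarray(areas, dtype=float)
    a = a[a >= a_min]
    return 1.0 + a.size / np.sum(np.log(a / a_min))
```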

  10. Microfluidic Chip-Based Detection and Intraspecies Strain Discrimination of Salmonella Serovars Derived from Whole Blood of Septic Mice

    PubMed Central

    Patterson, Adriana S.; Heithoff, Douglas M.; Ferguson, Brian S.; Soh, H. Tom; Mahan, Michael J.

    2013-01-01

    Salmonella is a zoonotic pathogen that poses a considerable public health and economic burden in the United States and worldwide. Resultant human diseases range from enterocolitis to bacteremia to sepsis and are acutely dependent on the particular serovar of Salmonella enterica subsp. enterica, which comprises over 99% of human-pathogenic S. enterica isolates. Point-of-care methods for detection and strain discrimination of Salmonella serovars would thus have considerable benefit to medical, veterinary, and field applications that safeguard public health and reduce industry-associated losses. Here we describe a single, disposable microfluidic chip that supports isothermal amplification and sequence-specific detection and discrimination of Salmonella serovars derived from whole blood of septic mice. The integrated microfluidic electrochemical DNA (IMED) chip consists of an amplification chamber that supports loop-mediated isothermal amplification (LAMP), a rapid, single-temperature amplification method as an alternative to PCR that offers advantages in terms of sensitivity, reaction speed, and amplicon yield. The amplification chamber is connected via a microchannel to a detection chamber containing a reagentless, multiplexed (here biplex) sensing array for sequence-specific electrochemical DNA (E-DNA) detection of the LAMP products. Validation of the IMED device was assessed by the detection and discrimination of S. enterica subsp. enterica serovars Typhimurium and Choleraesuis, the causative agents of enterocolitis and sepsis in humans, respectively. IMED chips conferred rapid (under 2 h) detection and discrimination of these strains at clinically relevant levels (<1,000 CFU/ml) from whole, unprocessed blood collected from septic animals. The IMED-based chip assay shows considerable promise as a rapid, inexpensive, and portable point-of-care diagnostic platform for the detection and strain-specific discrimination of microbial pathogens. PMID:23354710

  11. Alternative Fuels Data Center: Idle Reduction Benefits and Considerations

    Science.gov Websites

    Idle reduction saves fuel and money, protects public health and the environment, and increases U.S. energy security. Reducing idle time can also reduce engine wear and associated maintenance costs.

  12. Hydraulic elements in reduction of vibrations in mechanical systems

    NASA Astrophysics Data System (ADS)

    Białas, K.; Buchacz, A.

    2017-08-01

    This work presents a non-classical method for the design of mechanical systems with a vibration-reducing subsystem. The paper also introduces the synthesis of such vibration-reducing mechanical systems, understood as the design of this type of system. The synthesis may be applied to modify already existing systems in order to achieve a desired result. Elements which reduce vibrations can be constructed with passive, semi-active or active components. The systems considered here contain active elements. A hallmark of active elements is that their parameters can be changed over time and that they are powered from an external source. The range of possible implementations of active elements is very broad: they can be realized with electrical, pneumatic, hydraulic and other components. The system considered here consisted of mechanical and hydraulic elements. The hydraulic elements were used as the subsystem reducing unwanted vibration of the mechanical system and can be realized in the form of a hydraulic cylinder. In the case of active vibration reduction with a hydraulic cylinder it is very important to find the corresponding values of the hydraulic components, because these values affect the vibration frequency of the subsystem, which is related to effective vibration reduction [7,11].

  13. Selection of Instructional Methods and Techniques: The Basic Consideration of Teachers at Secondary School Level

    ERIC Educational Resources Information Center

    Ahmad, Saira Ijaz; Malik, Samina; Irum, Jamila; Zahid, Rabia

    2011-01-01

    The main objective of the study was to identify the instructional methods and techniques used by the secondary school teachers to transfer the instructions to the students and to explore the basic considerations of the teachers about the selection of these instructional methods and techniques. Participants of the study included were 442 teachers…

  14. Turtle: identifying frequent k-mers with cache-efficient algorithms.

    PubMed

    Roy, Rajat Shuvro; Bhattacharya, Debashish; Schliep, Alexander

    2014-07-15

    Counting the frequencies of k-mers in read libraries is often a first step in the analysis of high-throughput sequencing data. Infrequent k-mers are assumed to be a result of sequencing errors. The frequent k-mers constitute a reduced but error-free representation of the experiment, which can inform read error correction or serve as the input to de novo assembly methods. Ideally, the memory requirement for counting should be linear in the number of frequent k-mers and not in the, typically much larger, total number of k-mers in the read library. We present a novel method that balances time, space and accuracy requirements to efficiently extract frequent k-mers even for high-coverage libraries and large genomes such as human. Our method is designed to minimize cache misses by using a pattern-blocked Bloom filter to remove infrequent k-mers from consideration, in combination with a novel sort-and-compact scheme, instead of a hash, for the actual counting. Although this increases theoretical complexity, the savings in cache misses reduce the empirical running times. A variant of the method can resort to a counting Bloom filter for even larger savings in memory, at the expense of false-negative rates in addition to the false-positive rates common to all Bloom filter-based approaches. A comparison with the state-of-the-art shows reduced memory requirements and running times. The tools are freely available for download at http://bioinformatics.rutgers.edu/Software/Turtle and http://figshare.com/articles/Turtle/791582. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
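
    A minimal sketch of the pre-filtering idea is shown below: a Bloom filter absorbs the first sighting of every k-mer, so only k-mers seen at least twice reach the counter. It is purely illustrative; Turtle itself uses a cache-friendly pattern-blocked Bloom filter and a sort-and-compact scheme rather than the toy filter and Python Counter used here, and Bloom-filter false positives make the counts slightly approximate.

    ```python
    import hashlib
    from collections import Counter

    class ToyBloomFilter:
        # Plain Bloom filter; Turtle uses a cache-friendly pattern-blocked variant instead.
        def __init__(self, n_bits=1 << 22, n_hashes=3):
            self.n_bits, self.n_hashes = n_bits, n_hashes
            self.bits = bytearray(n_bits // 8)

        def _positions(self, item):
            for i in range(self.n_hashes):
                digest = hashlib.blake2b(item.encode(), digest_size=8, salt=bytes([i])).digest()
                yield int.from_bytes(digest, "little") % self.n_bits

        def check_and_add(self, item):
            # Returns True if the item was (probably) seen before, and records it.
            seen = True
            for pos in self._positions(item):
                byte, bit = divmod(pos, 8)
                if not (self.bits[byte] >> bit) & 1:
                    seen = False
                    self.bits[byte] |= 1 << bit
            return seen

    def count_frequent_kmers(reads, k=31):
        # Only k-mers observed at least twice reach the counter; singletons stay in the filter.
        bloom, counts = ToyBloomFilter(), Counter()
        for read in reads:
            for i in range(len(read) - k + 1):
                kmer = read[i:i + k]
                if bloom.check_and_add(kmer):
                    counts[kmer] += 1
        return {kmer: c + 1 for kmer, c in counts.items()}   # add back the first sighting

    reads = ["ACGTACGTACGTACGTACGTACGTACGTACGTACGT"] * 3
    print(count_frequent_kmers(reads, k=31))
    ```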

  15. Model-based coefficient method for calculation of N leaching from agricultural fields applied to small catchments and the effects of leaching reducing measures

    NASA Astrophysics Data System (ADS)

    Kyllmar, K.; Mårtensson, K.; Johnsson, H.

    2005-03-01

    A method to calculate N leaching from arable fields using model-calculated N leaching coefficients (NLCs) was developed. Using the process-based modelling system SOILNDB, leaching of N was simulated for four leaching regions in southern Sweden with 20-year climate series and a large number of randomised crop sequences based on regional agricultural statistics. To obtain N leaching coefficients, mean values of annual N leaching were calculated for each combination of main crop, following crop and fertilisation regime for each leaching region and soil type. The field-NLC method developed could be useful for following up water quality goals in e.g. small monitoring catchments, since it allows normal leaching from actual crop rotations and fertilisation to be determined regardless of the weather. The method was tested using field data from nine small intensively monitored agricultural catchments. The agreement between calculated field N leaching and measured N transport in catchment stream outlets, 19-47 and 8-38 kg ha-1 yr-1, respectively, was satisfactory in most catchments when contributions from land uses other than arable land and uncertainties in groundwater flows were considered. The possibility of calculating effects of crop combinations (crop and following crop) is of considerable value since changes in crop rotation constitute a large potential for reducing N leaching. When the effect of a number of potential measures to reduce N leaching (i.e. applying manure in spring instead of autumn; postponing ploughing-in of ley and green fallow in autumn; undersowing a catch crop in cereals and oilseeds; and increasing the area of catch crops by substituting winter cereals and winter oilseeds with corresponding spring crops) was calculated for the arable fields in the catchments using field-NLCs, N leaching was reduced by between 34 and 54% for the separate catchments when the best possible effect on the entire potential area was assumed.
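
    A sketch of how such field-level N leaching coefficients might be applied is given below. The coefficient values, crop combinations and field areas are hypothetical placeholders; in the method above the coefficients come from SOILNDB simulations per leaching region, soil type, crop, following crop and fertilisation regime.

    ```python
    # Hypothetical model-derived N leaching coefficients, kg N per ha and year,
    # keyed by (region, soil, crop, following crop, fertilisation regime)
    nlc = {
        ("south", "clay", "winter wheat", "spring barley", "mineral"): 22.0,
        ("south", "clay", "spring barley", "ley", "manure autumn"): 31.0,
        ("south", "sand", "ley", "winter wheat", "mineral"): 14.0,
    }

    # Hypothetical fields in a small catchment: (key fields..., area in ha)
    fields = [
        ("south", "clay", "winter wheat", "spring barley", "mineral", 12.5),
        ("south", "clay", "spring barley", "ley", "manure autumn", 8.0),
        ("south", "sand", "ley", "winter wheat", "mineral", 20.0),
    ]

    total_kg = sum(nlc[f[:5]] * f[5] for f in fields)    # kg N leached per year
    area_ha = sum(f[5] for f in fields)
    print(f"{total_kg:.0f} kg N/yr, {total_kg / area_ha:.1f} kg N/ha/yr")
    ```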

  16. The efficacy of low-level laser treatment in reducing pain and swelling after endodontic surgery: a systematic review

    NASA Astrophysics Data System (ADS)

    Moshari, Amirabbas; Vatanpour, Mehdi; Zakershahrak, Mehrsa

    2016-03-01

    Introduction: LLLT in the oral cavity is believed to reduce pain after endodontic surgery and wisdom tooth removal, to accelerate wound healing, and to have anti-inflammatory and regenerative effects. The aim of this systematic review was therefore to assess the evidence available for the efficacy of low-level laser treatment in reducing pain and swelling after endodontic surgery. Methods: The PubMed service of the U.S. National Library of Medicine was searched with applicable search strategies. No language restriction was applied. The last electronic search was performed on August 31, 2015. All randomized clinical trials on the efficacy of low-level laser treatment in reducing pain and swelling after endodontic surgery were considered for the meta-analysis. The quality of the included randomized clinical trials was appraised according to the CONSORT guidelines. Results: Only two randomized clinical trials were retrieved. These studies indicated that laser treatment could reduce pain and swelling, but the results were not significant. Conclusions: Low-level laser therapy can be advantageous for the reduction of postoperative pain, but there is no strong confirmation of its efficacy. Its clinical utility and applicability in endodontic surgery, along with the optimal energy dosage and the number of laser treatments needed after surgery, still demand further research and experiment.

  17. A Cross-Lingual Similarity Measure for Detecting Biomedical Term Translations

    PubMed Central

    Bollegala, Danushka; Kontonatsios, Georgios; Ananiadou, Sophia

    2015-01-01

    Bilingual dictionaries for technical terms such as biomedical terms are an important resource for machine translation systems as well as for humans who would like to understand a concept described in a foreign language. Often a biomedical term is first proposed in English and later it is manually translated to other languages. Despite the fact that there are large monolingual lexicons of biomedical terms, only a fraction of those term lexicons are translated to other languages. Manually compiling large-scale bilingual dictionaries for technical domains is a challenging task because it is difficult to find a sufficiently large number of bilingual experts. We propose a cross-lingual similarity measure for detecting most similar translation candidates for a biomedical term specified in one language (source) from another language (target). Specifically, a biomedical term in a language is represented using two types of features: (a) intrinsic features that consist of character n-grams extracted from the term under consideration, and (b) extrinsic features that consist of unigrams and bigrams extracted from the contextual windows surrounding the term under consideration. We propose a cross-lingual similarity measure using each of those feature types. First, to reduce the dimensionality of the feature space in each language, we propose prototype vector projection (PVP)—a non-negative lower-dimensional vector projection method. Second, we propose a method to learn a mapping between the feature spaces in the source and target language using partial least squares regression (PLSR). The proposed method requires only a small number of training instances to learn a cross-lingual similarity measure. The proposed PVP method outperforms popular dimensionality reduction methods such as the singular value decomposition (SVD) and non-negative matrix factorization (NMF) in a nearest neighbor prediction task. Moreover, our experimental results covering several language pairs such as English–French, English–Spanish, English–Greek, and English–Japanese show that the proposed method outperforms several other feature projection methods in biomedical term translation prediction tasks. PMID:26030738
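
    The sketch below illustrates the general idea of learning a cross-lingual mapping with partial least squares regression on character n-gram features and ranking translation candidates by similarity. The seed dictionary and terms are hypothetical, only intrinsic (character n-gram) features are used, and scikit-learn's PLSRegression stands in for the full pipeline; the proposed PVP projection and the extrinsic contextual features are omitted.

    ```python
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical tiny English-French seed dictionary (real training sets are far larger)
    seed_pairs = [("hepatitis", "hépatite"), ("cardiology", "cardiologie"),
                  ("neuron", "neurone"), ("antibody", "anticorps")]

    src_vec = CountVectorizer(analyzer="char", ngram_range=(2, 3))
    tgt_vec = CountVectorizer(analyzer="char", ngram_range=(2, 3))
    X = src_vec.fit_transform([s for s, _ in seed_pairs]).toarray()
    Y = tgt_vec.fit_transform([t for _, t in seed_pairs]).toarray()

    pls = PLSRegression(n_components=2)
    pls.fit(X, Y)        # learn a mapping from source-language features to target-language features

    # Rank candidate translations of an unseen source term by cosine similarity
    query = src_vec.transform(["hematology"]).toarray()
    candidates = ["hématologie", "anticorps"]
    C = tgt_vec.transform(candidates).toarray()
    scores = cosine_similarity(pls.predict(query), C)[0]
    print(dict(zip(candidates, scores.round(3))))
    ```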

  18. Predicting Physical Activity and Healthy Nutrition Behaviors Using Social Cognitive Theory: Cross-Sectional Survey among Undergraduate Students in Chongqing, China.

    PubMed

    Xu, Xianglong; Pu, Yang; Sharma, Manoj; Rao, Yunshuang; Cai, Yilin; Zhao, Yong

    2017-11-05

    (1) Background: Generally suggested public health measures to reduce obesity were to limit television (TV) viewing, enhance daily physical activities, enable the consumption of fruit and vegetables, and reduce sugar-sweetened beverage intake. This study analyzed the extent to which selected social cognitive theory constructs can predict these behaviors among Chinese undergraduate students. (2) Methods: This cross-sectional study included 1976 undergraduate students from six universities in Chongqing, China. A self-administered five-point Likert common physical activity and nutrition behavior scale based on social cognitive theory was utilized. (3) Results: This study included 687 (34.77%) males and 1289 (65.23%) females. A total of 60.14% of the students engaged in exercise for less than 30 min per day. Approximately 16.5% of the participants spent at least 4 h watching TV and sitting in front of a computer daily. Approximately 79% of the participants consumed less than five cups of fruit and vegetables daily. Undergraduate students who had high self-efficacy scores had more leisure time physical activities. Those who had high expectation scores spent considerable time watching TV and sitting in front of a computer. Undergraduate students who had high expectation and self-efficacy scores had substantially low consumption of sugar-sweetened beverages. Those who had high self-efficacy scores consumed considerable amounts of fruit and vegetables. Furthermore, the type of university, BMI group, gender, age, lack of siblings, and grade level were associated with the aforementioned four behaviors. (4) Conclusion: Physical inactivity and unhealthy nutrition behaviors are common among undergraduate students. This study used social cognitive theory to provide several implications for limiting the TV viewing, enhancing daily physical activities, consuming fruit and vegetables, and reducing sugar-sweetened beverage intake among undergraduate students.

  19. Predicting Physical Activity and Healthy Nutrition Behaviors Using Social Cognitive Theory: Cross-Sectional Survey among Undergraduate Students in Chongqing, China

    PubMed Central

    Pu, Yang; Sharma, Manoj; Rao, Yunshuang; Cai, Yilin; Zhao, Yong

    2017-01-01

    (1) Background: Generally suggested public health measures to reduce obesity were to limit television (TV) viewing, enhance daily physical activities, enable the consumption of fruit and vegetables, and reduce sugar-sweetened beverage intake. This study analyzed the extent to which selected social cognitive theory constructs can predict these behaviors among Chinese undergraduate students. (2) Methods: This cross-sectional study included 1976 undergraduate students from six universities in Chongqing, China. A self-administered five-point Likert common physical activity and nutrition behavior scale based on social cognitive theory was utilized. (3) Results: This study included 687 (34.77%) males and 1289 (65.23%) females. A total of 60.14% of the students engaged in exercise for less than 30 min per day. Approximately 16.5% of the participants spent at least 4 h watching TV and sitting in front of a computer daily. Approximately 79% of the participants consumed less than five cups of fruit and vegetables daily. Undergraduate students who had high self-efficacy scores had more leisure time physical activities. Those who have high expectation scores had considerable time watching TV and sitting in front of a computer. Undergraduate students who had high expectation and self-efficacy scores had substantially low consumption of sugar-sweetened beverages. Those who had high self-efficacy scores consumed considerable amounts of fruit and vegetables. Furthermore, the type of university, BMI group, gender, age, lack of siblings, and grade level were associated with the aforementioned four behaviors. (4) Conclusion: Physical inactivity and unhealthy nutrition behaviors are common among undergraduate students. This study used social cognitive theory to provide several implications for limiting the TV viewing, enhancing daily physical activities, consuming fruit and vegetables, and reducing sugar-sweetened beverage intake among undergraduate students. PMID:29113089

  20. Electrochemical incineration of wastes

    NASA Technical Reports Server (NTRS)

    Bhardwaj, R. C.; Sharma, D. K.; Bockris, J. OM.

    1990-01-01

    The novel technology of waste removal in space vehicles by electrochemical methods is presented to convert wastes into chemicals that can eventually be recycled. The important consideration for waste oxidation is to select the right kind of electrode (anode) material, one that is stable under anodic conditions and also a poor electrocatalyst for oxygen and chlorine evolution. On the basis of long-term electrolysis experiments on seven different electrodes and of the total organic carbon reduced, the two best electrodes were identified. The effect of redox ions in the electrolyte was studied. Though most of the experiments were done in mixtures of urine and waste, the experiments with redox couples involved 2.5 M sulfuric acid in order to avoid the precipitation of redox ions by urea. Two methods for long-term electrolysis of waste were investigated: (1) oxidation on Pt and lead dioxide electrodes using galvanostatic methods; and (2) a potentiostatic method on other electrodes. The advantage of the first method is the faster rate of oxidation. The chlorine evolution in the second method is ten times less than in the first. The research has shown that urine/feces mixtures can be oxidized to carbon dioxide and water, but current densities are low and must be improved. The perovskite and Ti4O7 coated with RuO2 are the best electrode materials found. Recent experiments with the redox agent improve the current density; however, sulfuric acid is required to keep the redox agent in solution to enhance oxidation effectively. It is desirable to reduce the use of acid and/or find substitutes.

  1. Detecting and removing multiplicative spatial bias in high-throughput screening technologies.

    PubMed

    Caraus, Iurie; Mazoure, Bogdan; Nadon, Robert; Makarenkov, Vladimir

    2017-10-15

    Considerable attention has been paid recently to improve data quality in high-throughput screening (HTS) and high-content screening (HCS) technologies widely used in drug development and chemical toxicity research. However, several environmentally- and procedurally-induced spatial biases in experimental HTS and HCS screens decrease measurement accuracy, leading to increased numbers of false positives and false negatives in hit selection. Although effective bias correction methods and software have been developed over the past decades, almost all of these tools have been designed to reduce the effect of additive bias only. Here, we address the case of multiplicative spatial bias. We introduce three new statistical methods meant to reduce multiplicative spatial bias in screening technologies. We assess the performance of the methods with synthetic and real data affected by multiplicative spatial bias, including comparisons with current bias correction methods. We also describe a wider data correction protocol that integrates methods for removing both assay and plate-specific spatial biases, which can be either additive or multiplicative. The methods for removing multiplicative spatial bias and the data correction protocol are effective in detecting and cleaning experimental data generated by screening technologies. As our protocol is of a general nature, it can be used by researchers analyzing current or next-generation high-throughput screens. The AssayCorrector program, implemented in R, is available on CRAN. makarenkov.vladimir@uqam.ca. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
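
    As a rough illustration of removing a multiplicative row/column bias from a screening plate, the sketch below applies a median polish in log space and exponentiates back. This is a generic textbook-style correction, not the three methods or the AssayCorrector implementation described above.

    ```python
    import numpy as np

    def remove_multiplicative_bias(plate, n_iter=10, eps=1e-9):
        # Median polish in log space: divide out row and column multiplicative effects,
        # then restore the overall plate level. Generic illustration, not AssayCorrector.
        log_p = np.log(plate + eps)
        residual = log_p.copy()
        for _ in range(n_iter):
            residual -= np.median(residual, axis=1, keepdims=True)   # row effects
            residual -= np.median(residual, axis=0, keepdims=True)   # column effects
        return np.exp(residual + np.median(log_p))

    # Example: a 16 x 24 plate distorted by a multiplicative column gradient
    rng = np.random.default_rng(1)
    raw = rng.lognormal(mean=0.0, sigma=0.2, size=(16, 24))
    biased = raw * np.linspace(0.7, 1.3, 24)[None, :]
    corrected = remove_multiplicative_bias(biased)
    print(biased.mean(axis=0).round(2))      # visible column trend
    print(corrected.mean(axis=0).round(2))   # trend largely removed
    ```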

  2. Improving the power of clinical trials of rheumatoid arthritis by using data on continuous scales when analysing response rates: an application of the augmented binary method

    PubMed Central

    Jenkins, Martin

    2016-01-01

    Objective. In clinical trials of RA, it is common to assess effectiveness using end points based upon dichotomized continuous measures of disease activity, which classify patients as responders or non-responders. Although dichotomization generally loses statistical power, there are good clinical reasons to use these end points; for example, to allow for patients receiving rescue therapy to be assigned as non-responders. We adopt a statistical technique called the augmented binary method to make better use of the information provided by these continuous measures and account for how close patients were to being responders. Methods. We adapted the augmented binary method for use in RA clinical trials. We used a previously published randomized controlled trial (Oral SyK Inhibition in Rheumatoid Arthritis-1) to assess its performance in comparison to a standard method treating patients purely as responders or non-responders. The power and error rate were investigated by sampling from this study. Results. The augmented binary method reached similar conclusions to standard analysis methods but was able to estimate the difference in response rates to a higher degree of precision. Results suggested that CI widths for ACR responder end points could be reduced by at least 15%, which could equate to reducing the sample size of a study by 29% to achieve the same statistical power. For other end points, the gain was even higher. Type I error rates were not inflated. Conclusion. The augmented binary method shows considerable promise for RA trials, making more efficient use of patient data whilst still reporting outcomes in terms of recognized response end points. PMID:27338084

  3. A three-step reconstruction method for fluorescence molecular tomography based on compressive sensing

    NASA Astrophysics Data System (ADS)

    Zhu, Yansong; Jha, Abhinav K.; Dreyer, Jakob K.; Le, Hanh N. D.; Kang, Jin U.; Roland, Per E.; Wong, Dean F.; Rahmim, Arman

    2017-02-01

    Fluorescence molecular tomography (FMT) is a promising tool for real time in vivo quantification of neurotransmission (NT) as we pursue in our BRAIN initiative effort. However, the acquired image data are noisy and the reconstruction problem is ill-posed. Further, while spatial sparsity of the NT effects could be exploited, traditional compressive-sensing methods cannot be directly applied as the system matrix in FMT is highly coherent. To overcome these issues, we propose and assess a three-step reconstruction method. First, truncated singular value decomposition is applied on the data to reduce matrix coherence. The resultant image data are input to a homotopy-based reconstruction strategy that exploits sparsity via l1 regularization. The reconstructed image is then input to a maximum-likelihood expectation maximization (MLEM) algorithm that retains the sparseness of the input estimate and improves upon the quantitation by accurate Poisson noise modeling. The proposed reconstruction method was evaluated in a three-dimensional simulated setup with fluorescent sources in a cuboidal scattering medium with optical properties simulating human brain cortex (reduced scattering coefficient: 9.2 cm-1, absorption coefficient: 0.1 cm-1), with tomographic measurements made using pixelated detectors. In different experiments, fluorescent sources of varying size and intensity were simulated. The proposed reconstruction method provided accurate estimates of the fluorescent source intensity, with a 20% lower root mean square error on average compared to the pure-homotopy method for all considered source intensities and sizes. Further, compared with the conventional l2 regularized algorithm, the proposed method overall reconstructed a substantially more accurate fluorescence distribution. The proposed method shows considerable promise and will be tested using more realistic simulations and experimental setups.
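
    The sketch below illustrates the first two steps in simplified form: truncating small singular values to obtain a better-conditioned reduced system, then solving a sparsity-promoting l1-regularized problem. The Poisson-modelling MLEM refinement is omitted, an ordinary Lasso solver stands in for the homotopy method, and the toy system matrix and source configuration are invented for illustration.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    def tsvd_l1_reconstruct(A, y, k=50, alpha=0.05):
        # Step 1: truncated SVD keeps the k largest singular values, giving a smaller,
        # better-conditioned system. Step 2: l1-regularized (sparse) recovery; an ordinary
        # Lasso solver stands in here for the homotopy method, and MLEM is omitted.
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        A_red = np.diag(s[:k]) @ Vt[:k]          # reduced system matrix
        y_red = U[:, :k].T @ y                   # data projected onto the retained subspace
        lasso = Lasso(alpha=alpha, positive=True, max_iter=10000)
        lasso.fit(A_red, y_red)
        return lasso.coef_                       # sparse estimate of source intensities

    # Toy problem: 200 measurements, 500 voxels, 3 active fluorescent sources
    rng = np.random.default_rng(0)
    A = rng.normal(size=(200, 500))
    x_true = np.zeros(500)
    x_true[[40, 250, 410]] = [2.0, 1.0, 1.5]
    y = A @ x_true + 0.01 * rng.normal(size=200)
    x_hat = tsvd_l1_reconstruct(A, y)
    print(np.flatnonzero(x_hat > 0.1))           # indices with appreciable recovered intensity
    ```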

  4. Global Obesity Study on Drivers for Weight Reduction Strategies

    PubMed Central

    Grebitus, Carola; Hartmann, Monika; Reynolds, Nikolai

    2015-01-01

    Objective To assess factors determining the reaction of individuals to the threats of overweight and obesity and to examine the interdependencies between weight-reducing strategies. Methods Cross-country survey covering 19 countries and 13,155 interviews. Data were analysed using a bivariate probit model that allows simultaneously analysing two weight-reducing strategies. Results Results show that weight-reducing strategies chosen are not independent from each other. Findings also reveal that different strategies are chosen by different population segments. Women are more likely to change their dietary patterns and less likely to become physically active after surpassing a weight threshold. In addition, the probability of a dietary change in case of overweight differs considerably between countries. The study also reveals that attitudes are an important factor for the strategy choice. Conclusions It is vital for public health policies to understand determinants of citizens’ engagement in weight reduction strategies once a certain threshold is reached. Thus, results can support the design of public health campaigns and programmes that aim to change community or national health behaviour trends taking into account, e.g., national differences. PMID:25765165

  5. Effect of Various Sodium Chloride Mass Fractions on Wheat and Rye Bread Using Different Dough Preparation Techniques

    PubMed Central

    Tańska, Małgorzata; Rotkiewicz, Daniela; Piętak, Andrzej

    2016-01-01

    Summary This study assessed selected properties of bread with a reduced amount of sodium chloride. The bread was made from white and wholemeal wheat flour and rye flour. The dough was prepared using three techniques: with yeast, natural sourdough or starter sourdough. Sodium chloride was added to the dough at 0, 0.5, 1.0 and 1.5% of the flour mass. The following bread properties were examined in the study: yield and volume of the loaf, moisture content, crumb firmness and porosity, and organoleptic properties. Reducing the mass fraction of added sodium chloride was not found to have a considerable effect on bread yield, whereas it had a significant and variable effect on the loaf volume and on crumb firmness and porosity. Organoleptic assessment showed diverse effects of sodium chloride addition on the sensory properties of bread, depending on the type of bread and the dough preparation method. Reduced mass fractions of sodium chloride changed the organoleptic properties of bread made with yeast and with starter sourdough to a greater extent than those of bread prepared with natural sourdough. PMID:27904407

  6. Gamma irradiation reduces the immunological toxicity of doxorubicin, anticancer drug

    NASA Astrophysics Data System (ADS)

    Kim, Jae-Hun; Sung, Nak-Yun; Raghavendran, H. Balaji; Yoon, Yohan; Song, Beom-Seok; Choi, Jong-il; Yoo, Young-Choon; Byun, Myung-Woo; Hwang, Young-Jeong; Lee, Ju-Woon

    2009-07-01

    Doxorubicin (DOX) is a widely used anticancer agent, but exhibits some immunological toxicity to patients during chemotherapy. The present study was conducted to evaluate the effect of gamma irradiation on the immunological response to DOX and on its inhibition of in vivo tumor mass. The results showed that DOX irradiated at 10 and 20 kGy reduced the inhibition of mouse peritoneal macrophage proliferation and induced the release of cytokines (TNF-α and IL-6) when compared with non-irradiated DOX. The cytotoxicity against human breast (MCF-7), murine colon adenocarcinoma (Colon 26) and human monocytic (THP-1) tumor cells was not significantly different between non-irradiated and irradiated DOX (P<0.05). In the in vivo study of tumor mass inhibition, gamma-irradiated DOX showed considerable inhibition of tumor mass, and this effect was statistically non-significant compared with non-irradiated DOX. In conclusion, gamma irradiation could be regarded as a potential method for reducing the immunological toxicity of DOX. Further research is needed to reveal the formation and activity of the radiolysis products induced by gamma irradiation.

  7. 26 CFR 20.2043-1 - Transfers for insufficient consideration.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., rights or powers enumerated and described in sections 2035 through 2038 and section 2041 are not subject... been made in good faith, and the price must have been an adequate and full equivalent reducible to a money value. If the price was less than such a consideration, only the excess of the fair market value...

  8. The effectiveness of an air cleaner in controlling droplet/aerosol particle dispersion emitted from a patient's mouth in the indoor environment of dental clinics.

    PubMed

    Chen, Chun; Zhao, Bin; Cui, Weilin; Dong, Lei; An, Na; Ouyang, Xiangying

    2010-07-06

    Dental healthcare workers (DHCWs) are at high risk of occupational exposure to droplets and aerosol particles emitted from patients' mouths during treatment. We evaluated the effectiveness of an air cleaner in reducing droplet and aerosol contamination by positioning the device in four different locations in an actual dental clinic. We applied computational fluid dynamics (CFD) methods to solve the governing equations of airflow, energy and dispersion of different-sized airborne droplets/aerosol particles. In a dental clinic, we measured the supply air velocity and temperature of the ventilation system, the airflow rate and the particle removal efficiency of the air cleaner to determine the boundary conditions for the CFD simulations. Our results indicate that use of an air cleaner in a dental clinic may be an effective method for reducing DHCWs' exposure to airborne droplets and aerosol particles. Further, we found that the probability of droplet/aerosol particle removal and the direction of airflow from the cleaner are both important control measures for droplet and aerosol contamination in a dental clinic. Thus, the distance between the air cleaner and droplet/aerosol particle source as well as the relative location of the air cleaner to both the source and the DHCW are important considerations for reducing DHCWs' exposure to droplets/aerosol particles emitted from the patient's mouth during treatments.

  9. The effectiveness of an air cleaner in controlling droplet/aerosol particle dispersion emitted from a patient's mouth in the indoor environment of dental clinics

    PubMed Central

    Chen, Chun; Zhao, Bin; Cui, Weilin; Dong, Lei; An, Na; Ouyang, Xiangying

    2010-01-01

    Dental healthcare workers (DHCWs) are at high risk of occupational exposure to droplets and aerosol particles emitted from patients' mouths during treatment. We evaluated the effectiveness of an air cleaner in reducing droplet and aerosol contamination by positioning the device in four different locations in an actual dental clinic. We applied computational fluid dynamics (CFD) methods to solve the governing equations of airflow, energy and dispersion of different-sized airborne droplets/aerosol particles. In a dental clinic, we measured the supply air velocity and temperature of the ventilation system, the airflow rate and the particle removal efficiency of the air cleaner to determine the boundary conditions for the CFD simulations. Our results indicate that use of an air cleaner in a dental clinic may be an effective method for reducing DHCWs' exposure to airborne droplets and aerosol particles. Further, we found that the probability of droplet/aerosol particle removal and the direction of airflow from the cleaner are both important control measures for droplet and aerosol contamination in a dental clinic. Thus, the distance between the air cleaner and droplet/aerosol particle source as well as the relative location of the air cleaner to both the source and the DHCW are important considerations for reducing DHCWs' exposure to droplets/aerosol particles emitted from the patient's mouth during treatments. PMID:20031985

  10. Lab-scale study on the application of In-Adit-Sulfate-Reducing System for AMD control.

    PubMed

    Ji, S W; Kim, S J

    2008-12-30

    In a study of the 29 operating passive systems for acid mine drainage (AMD) treatment, 19 systems showed various performance problems. Some systems showed very low efficiency even without visible leakage or overflow. Though the systems showed fairly good efficiency in metal removal (mainly iron) and pH control, sulfate removal rates were very low, which indicates the possibility of very poor sulfate reduction by sulfate-reducing bacteria (SRB). As an alternative, the In-Adit-Sulfate-Reducing System (IASRS), in which the SAPS is placed inside the adit to keep the temperature constant at about 15 degrees C, was suggested. Lab-scale model experiments of IASRS were carried out. Models 1 and 2 were run at 15 degrees C and 25 degrees C, respectively. Model 1 contained about half the COD of model 2 at the beginning of the operation. Metal removal ratios were higher than 90% in both systems. The two systems showed sulfate removal ratios of 23% and 27%, respectively, which were still considerably low, although higher than those of presently operating systems. However, since the synthetic AMD used was very low in pH (2.8) and very high in sulfate concentration, it is presumed that the sulfate removal ratio would increase if some of the suggested modifications were applied to the standard design.

  11. Containment wells to form hydraulic barriers along site boundaries.

    PubMed

    Vo, D; Ramamurthy, A S; Qu, J; Zhao, X P

    2008-12-15

    In the field, aquifer remediation methods include pump and treat procedures based on hydraulic control systems. They are used to reduce the level of residual contamination present in the soil and soil pores of aquifers. Often, physical barriers are erected along the boundaries of the target (aquifer) site to reduce the leakage of the released soil contaminant to the surrounding regions. Physical barriers are expensive to build and dismantle. Alternatively, based on simple hydraulic principles, containment wells or image wells injecting clear water can be designed and built to provide hydraulic barriers along the contaminated site boundaries. For brevity, only one pattern of containment well system that is very effective is presented in detail. The study briefly reports about the method of erecting a hydraulic barrier around a contaminated region based on the simple hydraulic principle of images. During the clean-up period, hydraulic barriers can considerably reduce the leakage of the released contaminant from the target site to surrounding pristine regions. Containment wells facilitate the formation of hydraulic barriers. Hence, they control the movement of contaminants away from the site that is being remedied. However, these wells come into play, only when the pumping operation for cleaning up the site is active. After operation, they can be filled with soil to permit the natural ground water movement. They can also be used as monitoring wells.

  12. Vibration transfer mobility measurements using maximum length sequences

    NASA Astrophysics Data System (ADS)

    Singleton, Herbert L.

    2005-09-01

    Vibration transfer mobility measurements are required under Federal Transit Administration guidelines when developing detailed predictions of ground-borne vibration for rail transit systems. These measurements typically use a large instrumented hammer to generate impulses in the soil. These impulses are measured by an array of accelerometers to characterize the transfer mobility of the ground in a localized area. While effective, these measurements often make use of heavy, custom-engineered equipment to produce the impulse signal. To obtain satisfactory signal-to-noise ratios, it is necessary to generate multiple impulses to generate an average value, but this process involves considerable physical labor in the field. To address these shortcomings, a transfer mobility measurement system utilizing a tactile transducer and maximum length sequences (MLS) was developed. This system uses lightweight off-the-shelf components to significantly reduce the weight and cost of the system. The use of MLS allows for adequate signal-to-noise ratio from the tactile transducer, while minimizing the length of the measurement. Tests of the MLS system show good agreement with the impulse-based method. The combination of the cost savings and reduced weight of this new system facilitates transfer mobility measurements that are less physically demanding, and more economical when compared with current methods.
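
    A minimal sketch of the MLS measurement principle, assuming an invented transfer path, is shown below: because the circular autocorrelation of a maximum length sequence is close to an impulse, cross-correlating the measured response with the excitation approximately recovers the impulse response, and averaging repeated sequences raises the signal-to-noise ratio.

    ```python
    import numpy as np
    from scipy.signal import max_len_seq

    rng = np.random.default_rng(0)

    # MLS excitation mapped from {0, 1} to {-1, +1}
    mls, _ = max_len_seq(14)
    x = 2.0 * mls - 1.0
    N = x.size

    # Hypothetical ground transfer path: a decaying oscillatory impulse response
    h_true = np.exp(-np.arange(256) / 40.0) * np.cos(np.arange(256) / 5.0)

    # Measured response = circular convolution of excitation and path, plus noise
    y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(h_true, N)))
    y += 0.05 * rng.normal(size=N)

    # Circular cross-correlation with the excitation approximately recovers h
    h_est = np.real(np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(x)))) / N
    print(np.allclose(h_est[:256], h_true, atol=0.05))   # close to the true response
    ```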

  13. Measuring sunscreen protection against solar-simulated radiation-induced structural radical damage to skin using ESR/spin trapping: development of an ex vivo test method.

    PubMed

    Haywood, Rachel; Volkov, Arsen; Andrady, Carima; Sayer, Robert

    2012-03-01

    The in vitro star system used for sunscreen UVA testing is not an absolute measure of skin protection, being a ratio of the total integrated UVA/UVB absorption. The in vivo persistent-pigment-darkening method requires human volunteers. We investigated the use of the ESR-detectable DMPO protein radical-adduct in solar-simulator-irradiated skin substitutes for sunscreen testing. Sunscreens rated SPF 20+ with UVA protection reduced this adduct by 40-65% when applied at 2 mg/cm(2). SPF 15 organic UVA-UVB (BMDBM-OMC) and TiO(2)-UVB filters and a novel UVA-TiO(2) filter reduced it by 21, 31 and 70%, respectively. Conventional broad-spectrum sunscreens do not fully protect against protein radical damage in skin, possibly owing to visible-light contributions to damage or UVA-filter degradation. Anisotropic spectra of DMPO-trapped oxygen-centred radicals, proposed intermediates of lipid oxidation, were detected in irradiated sunscreen and DMPO. Sunscreen protection might be improved by consideration of visible-light protection and the design of filters that minimise radical leakage and lipid oxidation.

  14. Evaluations of catalysts for wet oxidation waste management in CELSS

    NASA Astrophysics Data System (ADS)

    Oguchi, Mitsuo; Nitta, Keiji

    1992-11-01

    A wet oxidation method is considered to be one of the most effective methods of waste processing and recycling in CELSS (Controlled Ecological Life Support System). The first test, using rabbit waste as raw material, was conducted at a decomposition temperature of 280 °C for 30 minutes with an initial pure oxygen pressure of 4.9 MPa (50 kgf/cm2) before heating, and the following results were obtained. The value of COD (Chemical Oxygen Demand) was reduced by 82.5% by the wet oxidation, and the Kjeldahl nitrogen concentration decreased by 98.8%. However, the organic carbon remaining in the residual solution was mostly acetic acid, and ammonia was produced. To activate the oxidation more strongly, a second set of tests using catalysts such as Pd, Ru and Ru+Rh was conducted. These tests showed the effectiveness of the catalysts for oxidizing the raw material: the COD and Kjeldahl nitrogen values were drastically decreased, by 99.65% and 99.88%, respectively. Furthermore, the quantities of acetic acid and ammonia were reduced considerably. On the other hand, nitrate showed a value 30 times that obtained without catalytic oxidation.

  15. Optimizing Eco-Efficiency Across the Procurement Portfolio.

    PubMed

    Pelton, Rylie E O; Li, Mo; Smith, Timothy M; Lyon, Thomas P

    2016-06-07

    Manufacturing organizations' environmental impacts are often attributable to processes in the firm's upstream supply chain. Environmentally preferable procurement (EPP) and the establishment of environmental purchasing criteria can potentially reduce these indirect impacts. Life-cycle assessment (LCA) can help identify the purchasing criteria that are most effective in reducing environmental impacts. However, the high costs of LCA and the problems associated with the comparability of results have limited efforts to integrate procurement performance with quantitative organizational environmental performance targets. Moreover, environmental purchasing criteria, when implemented, are often established on a product-by-product basis without consideration of other products in the procurement portfolio. We develop an approach that utilizes streamlined LCA methods, together with linear programming, to determine optimal portfolios of product impact-reduction opportunities under budget constraints. The approach is illustrated through a simulated breakfast cereal manufacturing firm procuring grain, containerboard boxes, plastic packaging, electricity, and industrial cleaning solutions. Results suggest that extending EPP decisions and resources to the portfolio level, recently made feasible through the methods illustrated herein, can provide substantially greater CO2e and water-depletion reductions per dollar spend than a product-by-product approach, creating opportunities for procurement organizations to participate in firm-wide environmental impact reduction targets.
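
    A sketch of the portfolio-level formulation is given below as a small linear program: choose adoption fractions of impact-reduction opportunities to maximize total CO2e avoided under a budget constraint. All figures are invented placeholders, and the single-objective LP is a simplification of the streamlined-LCA-plus-LP approach described above.

    ```python
    from scipy.optimize import linprog

    # Hypothetical impact-reduction opportunities across a procurement portfolio
    #                  grain   boxes  plastic  electricity  cleaning
    co2e_reduction = [120.0,   45.0,  30.0,    200.0,       10.0]   # t CO2e if fully adopted
    cost           = [ 40.0,   15.0,  12.0,     90.0,        5.0]   # k$ to implement
    budget = 100.0                                                   # k$ available

    # Maximize total CO2e reduction subject to the budget; x_i = adopted fraction in [0, 1]
    res = linprog(c=[-r for r in co2e_reduction],      # linprog minimizes, so negate
                  A_ub=[cost], b_ub=[budget],
                  bounds=[(0.0, 1.0)] * len(cost),
                  method="highs")
    print("adoption fractions:", res.x.round(2))
    print("total CO2e avoided:", round(-res.fun, 1))
    ```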

  16. Characterizing model uncertainties in the life cycle of lignocellulose-based ethanol fuels.

    PubMed

    Spatari, Sabrina; MacLean, Heather L

    2010-11-15

    Renewable and low carbon fuel standards being developed at federal and state levels require an estimation of the life cycle carbon intensity (LCCI) of candidate fuels that can substitute for gasoline, such as second generation bioethanol. Estimating the LCCI of such fuels with a high degree of confidence requires the use of probabilistic methods to account for known sources of uncertainty. We construct life cycle models for the bioconversion of agricultural residue (corn stover) and energy crops (switchgrass) and explicitly examine uncertainty using Monte Carlo simulation. Using statistical methods to identify significant model variables from public data sets and Aspen Plus chemical process models, we estimate stochastic life cycle greenhouse gas (GHG) emissions for the two feedstocks combined with two promising fuel conversion technologies. The approach can be generalized to other biofuel systems. Our results show potentially high and uncertain GHG emissions for switchgrass-ethanol due to uncertain CO₂ flux from land use change and N₂O flux from N fertilizer. However, corn stover-ethanol, with its low-in-magnitude, tight-in-spread LCCI distribution, shows considerable promise for reducing life cycle GHG emissions relative to gasoline and corn-ethanol. Coproducts are important for reducing the LCCI of all ethanol fuels we examine.
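
    A bare-bones Monte Carlo propagation of parameter uncertainty into a life cycle carbon intensity is sketched below. The distributions and their parameters are placeholders chosen only to show the mechanics; the study above derives its inputs from public data sets and Aspen Plus process models.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    # Hypothetical parameter distributions, g CO2e per MJ of ethanol (illustration only)
    process = rng.normal(20.0, 3.0, n)               # conversion-process emissions
    n2o     = rng.lognormal(np.log(8.0), 0.5, n)     # N2O from fertilizer application
    luc     = rng.normal(5.0, 10.0, n)               # land-use-change carbon flux
    credit  = rng.normal(-4.0, 1.0, n)               # coproduct credit (negative = benefit)

    lcci = process + n2o + luc + credit              # life cycle carbon intensity samples
    print("LCCI percentiles (5th, 50th, 95th):", np.percentile(lcci, [5, 50, 95]).round(1))
    ```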

  17. Consideration of reinforcement mechanism in the short fiber mixing granular materials by granular element simulations

    NASA Astrophysics Data System (ADS)

    Mori, Kentaro; Kaneko, Kenji; Hashizume, Yutaka

    2017-06-01

    The short fiber mixing method is well known as one of the methods to improve the strength of granular soils in geotechnical engineering. The mechanical properties of short-fiber-mixed granular materials are influenced by many factors, such as the mixture ratio of the short fiber, the fiber material, the fiber length, and the fiber orientation. In particular, the mixture ratio of the short fibers is very important in mixture design. In a previous study, a series of tri-axial compression experiments showed that the strength is reduced when too much short fiber is mixed in; that is, there is an "optimum mixture ratio" for short-fiber-mixed granular soils. In this study, to examine the mechanism behind the optimum mixture ratio, we carried out numerical experiments using the granular element method. The results indicate that the strength decreases when too many grain-fiber contact points exist, because the friction coefficient at grain-fiber contacts is smaller than that at grain-grain contacts.

  18. Intelligent Scheduling for Underground Mobile Mining Equipment.

    PubMed

    Song, Zhen; Schunnesson, Håkan; Rinne, Mikael; Sturgul, John

    2015-01-01

    Many studies have been carried out and many commercial software applications have been developed to improve the performance of surface mining operations, especially for the loader-truck cycle of surface mining. However, there have been relatively few studies aiming to improve the mining process of underground mines. In underground mines, mobile mining equipment is mostly scheduled instinctively, without theoretical support for these decisions. Furthermore, in the case of unexpected events, it is hard for miners to rapidly find solutions to reschedule and adapt to the changes. This investigation first introduces the motivation, the technical background, and the objective of the study. A decision support instrument (a schedule optimizer for mobile mining equipment) is proposed and described to address this issue. The method and related algorithms used in this instrument are presented and discussed. The proposed method was tested using a real case of the Kittilä mine in Finland. The results suggest that the proposed method can considerably improve the working efficiency and reduce the working time of the underground mine.

  19. New Trends in Pesticide Residue Analysis in Cereals, Nutraceuticals, Baby Foods, and Related Processed Consumer Products.

    PubMed

    Raina-Fulton, Renata

    2015-01-01

    Pesticide residue methods have been developed for a wide variety of food products including cereal-based foods, nutraceuticals and related plant products, and baby foods. These cereal, fruit, vegetable, and plant-based products provide the basis for many processed consumer products. For cereal and nutraceuticals, which are dry sample products, a modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) method has been used with additional steps to allow wetting of the dry sample matrix and subsequent cleanup using dispersive or cartridge format SPE to reduce matrix effects. More processed foods may have lower pesticide concentrations but higher co-extracts that can lead to signal suppression or enhancement with MS detection. For complex matrixes, GC/MS/MS or LC/electrospray ionization (positive or negative ion)-MS/MS is more frequently used. The extraction and cleanup methods vary with different sample types particularly for cereal-based products, and these different approaches are discussed in this review. General instrument considerations are also discussed.

  20. Lipidic cubic phase injector is a viable crystal delivery system for time-resolved serial crystallography

    DOE PAGES

    Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett; ...

    2016-08-22

    Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. Furthermore, this study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX.

  1. Theoretical prediction of welding distortion in large and complex structures

    NASA Astrophysics Data System (ADS)

    Deng, De-An

    2010-06-01

    Welding technology is widely used to assemble large thin plate structures such as ships, automobiles, and passenger trains because of its high productivity. However, it is impossible to avoid welding-induced distortion during the assembly process. Welding distortion not only reduces the fabrication accuracy of a weldment, but also decreases the productivity due to correction work. If welding distortion can be predicted using a practical method beforehand, the prediction will be useful for taking appropriate measures to control the dimensional accuracy to an acceptable limit. In this study, a two-step computational approach, which is a combination of a thermoelastic-plastic finite element method (FEM) and an elastic finite element with consideration for large deformation, is developed to estimate welding distortion for large and complex welded structures. Welding distortions in several representative large complex structures, which are often used in shipbuilding, are simulated using the proposed method. By comparing the predictions and the measurements, the effectiveness of the two-step computational approach is verified.

  2. OSPAR standard method and software for statistical analysis of beach litter data.

    PubMed

    Schulz, Marcus; van Loon, Willem; Fleet, David M; Baggelaar, Paul; van der Meulen, Eit

    2017-09-15

    The aim of this study is to develop standard statistical methods and software for the analysis of beach litter data. The optimal ensemble of statistical methods comprises the Mann-Kendall trend test, the Theil-Sen slope estimation, the Wilcoxon step trend test and basic descriptive statistics. The application of Litter Analyst, a tailor-made software for analysing the results of beach litter surveys, to OSPAR beach litter data from seven beaches bordering on the south-eastern North Sea, revealed 23 significant trends in the abundances of beach litter types for the period 2009-2014. Litter Analyst revealed a large variation in the abundance of litter types between beaches. To reduce the effects of spatial variation, trend analysis of beach litter data can most effectively be performed at the beach or national level. Spatial aggregation of beach litter data within a region is possible, but resulted in a considerable reduction in the number of significant trends. Copyright © 2017 Elsevier Ltd. All rights reserved.
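
    The core trend statistics can be reproduced with standard tools, as sketched below for one litter type on one beach: a Mann-Kendall-type test (expressed here through Kendall's tau against time) and the Theil-Sen slope with its confidence interval. The counts are hypothetical; the OSPAR workflow itself is implemented in the Litter Analyst software.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical annual counts of one litter type on one beach, 2009-2014
    years = np.arange(2009, 2015)
    counts = np.array([54, 61, 47, 40, 38, 33])

    tau, p = stats.kendalltau(years, counts)             # Kendall's tau against time
    slope, intercept, lo, hi = stats.theilslopes(counts, years, 0.95)

    print(f"Kendall tau = {tau:.2f}, p = {p:.3f}")
    print(f"Theil-Sen slope = {slope:.1f} items/yr (95% CI {lo:.1f} to {hi:.1f})")
    ```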

  3. Synchronization Design and Error Analysis of Near-Infrared Cameras in Surgical Navigation.

    PubMed

    Cai, Ken; Yang, Rongqian; Chen, Huazhou; Huang, Yizhou; Wen, Xiaoyan; Huang, Wenhua; Ou, Shanxing

    2016-01-01

    The accuracy of optical tracking systems is important to scientists. With the improvements reported in this regard, such systems have been applied to an increasing number of operations. To enhance the accuracy of these systems further and to reduce the effect of synchronization and visual field errors, this study introduces a field-programmable gate array (FPGA)-based synchronization control method, a method for measuring synchronous errors, and an error distribution map in field of view. Synchronization control maximizes the parallel processing capability of FPGA, and synchronous error measurement can effectively detect the errors caused by synchronization in an optical tracking system. The distribution of positioning errors can be detected in field of view through the aforementioned error distribution map. Therefore, doctors can perform surgeries in areas with few positioning errors, and the accuracy of optical tracking systems is considerably improved. The system is analyzed and validated in this study through experiments that involve the proposed methods, which can eliminate positioning errors attributed to asynchronous cameras and different fields of view.

  4. Boosting quantum annealer performance via sample persistence

    NASA Astrophysics Data System (ADS)

    Karimi, Hamed; Rosenberg, Gili

    2017-07-01

    We propose a novel method for reducing the number of variables in quadratic unconstrained binary optimization problems, using a quantum annealer (or any sampler) to fix the value of a large portion of the variables to values that have a high probability of being optimal. The resulting problems are usually much easier for the quantum annealer to solve, due to their being smaller and consisting of disconnected components. This approach significantly increases the success rate and number of observations of the best known energy value in samples obtained from the quantum annealer, when compared with calling the quantum annealer without using it, even when using fewer annealing cycles. Use of the method results in a considerable improvement in success metrics even for problems with high-precision couplers and biases, which are more challenging for the quantum annealer to solve. The results are further enhanced by applying the method iteratively and combining it with classical pre-processing. We present results for both Chimera graph-structured problems and embedded problems from a real-world application.
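
    A stripped-down version of the persistence idea is sketched below: variables whose value agrees across a large fraction of returned samples are clamped, leaving a smaller residual problem for the next round. The agreement threshold and the random stand-in samples are illustrative; the method above additionally iterates this step and combines it with classical pre-processing.

    ```python
    import numpy as np

    def fix_persistent_variables(samples, agreement=0.9):
        # samples: (n_samples, n_vars) array of +/-1 spins returned by the sampler.
        # A variable is fixed when at least `agreement` of the samples share its sign.
        mean_spin = samples.mean(axis=0)
        return {i: int(np.sign(m)) for i, m in enumerate(mean_spin)
                if abs(m) >= 2.0 * agreement - 1.0}

    # Toy stand-in for annealer output: 100 samples over 50 spin variables
    rng = np.random.default_rng(0)
    samples = rng.choice([-1, 1], size=(100, 50), p=[0.15, 0.85])
    fixed = fix_persistent_variables(samples)
    print(f"{len(fixed)} of 50 variables fixed; the rest form the reduced problem")
    ```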

  5. Lipidic cubic phase injector is a viable crystal delivery system for time-resolved serial crystallography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett

    Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. Furthermore, this study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX.

  6. Antioxidant activity, total phenolics and flavonoids contents: Should we ban in vitro screening methods?

    PubMed

    Granato, Daniel; Shahidi, Fereidoon; Wrolstad, Ronald; Kilmartin, Paul; Melton, Laurence D; Hidalgo, Francisco J; Miyashita, Kazuo; Camp, John van; Alasalvar, Cesarettin; Ismail, Amin B; Elmore, Stephen; Birch, Gordon G; Charalampopoulos, Dimitris; Astley, Sian B; Pegg, Ronald; Zhou, Peng; Finglas, Paul

    2018-10-30

    As many studies are exploring the association between ingestion of bioactive compounds and decreased risk of non-communicable diseases, the scientific community continues to show considerable interest in these compounds. In addition, as many non-nutrients with putative health benefits are reducing agents, hydrogen donors, singlet oxygen quenchers or metal chelators, measurement of antioxidant activity using in vitro assays has become very popular over recent decades. Measuring concentrations of total phenolics, flavonoids, and other compound (sub)classes using UV/Vis spectrophotometry offers a rapid chemical index, but chromatographic techniques are necessary to establish structure-activity. For bioactive purposes, in vivo models are required or, at the very least, methods that employ distinct mechanisms of action (i.e., single electron transfer, transition metal chelating ability, and hydrogen atom transfer). In this regard, better understanding and application of in vitro screening methods should help design of future research studies on 'bioactive compounds'. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Lipidic cubic phase injector is a viable crystal delivery system for time-resolved serial crystallography

    PubMed Central

    Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett; Gati, Cornelius; Kimura, Tetsunari; Milne, Christopher; Milathianaki, Despina; Kubo, Minoru; Wu, Wenting; Conrad, Chelsie; Coe, Jesse; Bean, Richard; Zhao, Yun; Båth, Petra; Dods, Robert; Harimoorthy, Rajiv; Beyerlein, Kenneth R.; Rheinberger, Jan; James, Daniel; DePonte, Daniel; Li, Chufeng; Sala, Leonardo; Williams, Garth J.; Hunter, Mark S.; Koglin, Jason E.; Berntsen, Peter; Nango, Eriko; Iwata, So; Chapman, Henry N.; Fromme, Petra; Frank, Matthias; Abela, Rafael; Boutet, Sébastien; Barty, Anton; White, Thomas A.; Weierstall, Uwe; Spence, John; Neutze, Richard; Schertler, Gebhard; Standfuss, Jörg

    2016-01-01

    Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. This study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX. PMID:27545823

  8. Bayes and empirical Bayes methods for reduced rank regression models in matched case-control studies.

    PubMed

    Satagopan, Jaya M; Sen, Ananda; Zhou, Qin; Lan, Qing; Rothman, Nathaniel; Langseth, Hilde; Engel, Lawrence S

    2016-06-01

    Matched case-control studies are popular designs used in epidemiology for assessing the effects of exposures on binary traits. Modern studies increasingly enjoy the ability to examine a large number of exposures in a comprehensive manner. However, several risk factors often tend to be related in a nontrivial way, undermining efforts to identify the risk factors using standard analytic methods due to inflated type-I errors and possible masking of effects. Epidemiologists often use data reduction techniques by grouping the prognostic factors using a thematic approach, with themes deriving from biological considerations. We propose shrinkage-type estimators based on Bayesian penalization methods to estimate the effects of the risk factors using these themes. The properties of the estimators are examined using extensive simulations. The methodology is illustrated using data from a matched case-control study of polychlorinated biphenyls in relation to the etiology of non-Hodgkin's lymphoma. © 2015, The International Biometric Society.

  9. POSSIBLE APPLICATIONS OF IONIZING RADIATIONS IN THE FRUIT, VEGETABLE AND RELATED INDUSTRIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clarke, I.D.

    1959-10-01

    Some effects of gamma radiation have been observed on the texture, flavor, and processing properties of fruits and vegetables. An appreciable reduction in mold decay of strawberries during storage at 20 and 1 deg C has been observed after treatment with 0.2 to 0.4 Mrad. Flavor was affected slightly and the viscosity of the juice was reduced considerably. The viscosity of juice extracted from irradiated whole berries was reduced considerably by doses between 0.4 and 0.6 Mrad without critically affecting the flavor of the juice. The irradiated berries were free from a significant number of mold spores. No immediate commercial application was foreseen in the black currant juice industry. Plums were shown to be free from mold after a treatment of 0.4 Mrad and the initial spore count of apples is considerably reduced by a dose of 0.2 Mrad. Some evidence has been obtained that the softening of pears can be delayed by doses of 0.3 to 0.5 Mrad. The published literature on the effects of radiation on other fruits and vegetables has been reviewed and commented upon. (auth)

  10. Sound propagation in street canyons: comparison between diffusely and geometrically reflecting boundaries

    PubMed

    Kang

    2000-03-01

    This paper systematically compares the sound fields in street canyons with diffusely and geometrically reflecting boundaries. For diffuse boundaries, a radiosity-based theoretical/computer model has been developed. For geometrical boundaries, the image source method has been used. Computations using the models show that there are considerable differences between the sound fields resulting from the two kinds of boundaries. By replacing diffuse boundaries with geometrical boundaries, the sound attenuation along the length becomes significantly less; the RT30 is considerably longer; and the extra attenuation caused by air or vegetation absorption is reduced. There are also some similarities between the sound fields under the two boundary conditions. For example, in both cases the sound attenuation along the length with a given amount of absorption is the highest if the absorbers are arranged on one boundary and the lowest if they are evenly distributed on all boundaries. Overall, the results suggest that, from the viewpoint of urban noise reduction, it is better to design the street boundaries as diffusely reflective rather than acoustically smooth.

  11. Risk assessment for ecotoxicity of pharmaceuticals--an emerging issue.

    PubMed

    Kar, Supratik; Roy, Kunal

    2012-03-01

    The existence of a large amount of pharmaceuticals and their active metabolites in the environment has recently been considered one of the most serious concerns in environmental sciences. A large diversity of pharmaceuticals has been found in the environment in considerable amounts that are not only destructive to the environment but also harmful to humans and animals. There is a considerable lack of knowledge about the environmental fate and quantification of a large number of pharmaceuticals. This communication aims to review the literature regarding the occurrence of pharmaceuticals and their metabolites in the environment, their persistence, environmental fate and toxicity, as well as the application of theoretical, non-experimental, non-animal, alternative and, in particular, in silico methods to provide information about the basic physicochemical and fate properties of pharmaceuticals in the environment. The reader will gain an overview of risk assessment strategies for the ecotoxicity of pharmaceuticals and advances in the application of quantitative structure-toxicity relationships (QSTR) in this field. This review justifies the need to develop more QSTR models for the prediction of ecotoxicity of pharmaceuticals in order to reduce the time and cost involved in such exercises.

  12. Investigation of methods for calculating duration of lightsignal regulation cycle

    NASA Astrophysics Data System (ADS)

    Dorokhin, S. V.; Novikov, A. N.; Zelikov, V. A.; Strukov, Y. V.; Novikov, I. A.; Shevtsova, A. G.; Likhachev, D. V.

    2018-05-01

    The research objective is the development of a new approach to determining the operating mode of traffic lights that takes advanced characteristics of the traffic flow into consideration. This allows transport delay to be decreased significantly while a vehicle is en route and, through signal control, improves key parameters such as fuel consumption, travel time and traffic speed. The research shows that the basic approaches currently applied to determine the main parameters of traffic lights do not account for a number of traffic-flow characteristics, which leads to many challenges that manifest as ineffective use of traffic lights. Critical transport delays occur at many controlled intersections, which can lead to traffic accidents. The research contributes to the knowledge base by studying the experience of using these approaches and, by improving them and developing new ones, reducing the associated risks to a minimum. The study also provides an opportunity to expand the scope of further research in this area, combining and applying lessons learned.

  13. Challenges and gaps for energy planning models in the developing-world context

    NASA Astrophysics Data System (ADS)

    Debnath, Kumar Biswajit; Mourshed, Monjur

    2018-03-01

    Energy planning models (EPMs) support multi-criteria assessments of the impact of energy policies on the economy and environment. Most EPMs originated in developed countries and are primarily aimed at reducing greenhouse gas emissions while enhancing energy security. In contrast, most, if not all, developing countries are predominantly concerned with increasing energy access. Here, we review thirty-four widely used EPMs to investigate their applicability to developing countries and find an absence of consideration of the objectives, challenges, and nuances of the developing context. Key deficiencies arise from the lack of deliberation of the low energy demand resulting from lack of access and availability of supply. Other inadequacies include the lack of consideration of socio-economic nuances such as the prevalence of corruption and resulting cost inflation, the methods for adequately addressing the shortcomings in data quality, availability and adequacy, and the effects of climate change. We argue for further research on characterization and modelling of suppressed demand, climate change impacts, and socio-political feedback in developing countries, and the development of contextual EPMs.

  14. Collecting behavioural data using the world wide web: considerations for researchers

    PubMed Central

    Rhodes, S; Bowie, D; Hergenrather, K

    2003-01-01

    Objective: To identify and describe advantages, challenges, and ethical considerations of web based behavioural data collection. Methods: This discussion is based on the authors' experiences in survey development and study design, respondent recruitment, and internet research, and on the experiences of others as found in the literature. Results: The advantages of using the world wide web to collect behavioural data include rapid access to numerous potential respondents and previously hidden populations, respondent openness and full participation, opportunities for student research, and reduced research costs. Challenges identified include issues related to sampling and sample representativeness, competition for the attention of respondents, and potential limitations resulting from the much cited "digital divide", literacy, and disability. Ethical considerations include anonymity and privacy, providing and substantiating informed consent, and potential risks of malfeasance. Conclusions: Computer mediated communications, including electronic mail, the world wide web, and interactive programs will play an ever increasing part in the future of behavioural science research. Justifiable concerns regarding the use of the world wide web in research exist, but as access to, and use of, the internet becomes more widely and representatively distributed globally, the world wide web will become more applicable. In fact, the world wide web may be the only research tool able to reach some previously hidden population subgroups. Furthermore, many of the criticisms of online data collection are common to other survey research methodologies. PMID:12490652

  15. Introduction of a computer-based method for automated planning of reduction paths under consideration of simulated muscular forces.

    PubMed

    Buschbaum, Jan; Fremd, Rainer; Pohlemann, Tim; Kristen, Alexander

    2017-08-01

    Reduction is a crucial step in the surgical treatment of bone fractures. Finding an optimal path for restoring anatomical alignment is considered technically demanding because collisions, as well as high forces caused by the surrounding soft tissues, can impede the desired reduction movements. Repetition of reduction movements leads to a trial-and-error process which prolongs the duration of surgery. By planning an appropriate reduction path-an optimal sequence of target-directed movements-these problems should be overcome. For this purpose, a computer-based method has been developed. Using the example of simple femoral shaft fractures, 3D models are generated from CT images. A reposition algorithm aligns both fragments by reconstructing their broken edges. According to the criteria of a deduced planning strategy, a modified A*-algorithm searches for a collision-free route of minimal force from the dislocated into the computed target position. Muscular forces are considered using a musculoskeletal reduction model (OpenSim model), and bone collisions are detected by an appropriate method. Five femoral SYNBONE models were broken into different fracture classification types and were automatically reduced from ten randomly selected displaced positions. The highest mean translational and rotational errors for achieving target alignment are [Formula: see text] and [Formula: see text]. The mean value and standard deviation of the occurring forces are [Formula: see text] for M. tensor fasciae latae and [Formula: see text] for M. semitendinosus over all trials. These pathways are precise and collision-free, the required forces are minimized, and they are thus regarded as optimal paths. A novel method for planning reduction paths under consideration of collisions and muscular forces is introduced. The results deliver additional knowledge for an appropriate tactical reduction procedure and can provide a basis for further navigated or robotic-assisted developments.
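    As an illustration of the search idea described above, the sketch below runs a standard A* search on a small 2D grid in which each step is penalized both by distance and by a hypothetical soft-tissue force value attached to the cell. The grid, the force field, the obstacle cells and the cost weighting are illustrative assumptions only; the paper's planner works on 3D fragment poses with an OpenSim musculoskeletal model and a dedicated collision detector.

```python
import heapq

# Hypothetical 2D grid: each cell stores a soft-tissue "force" penalty (illustrative values).
FORCE = {
    (x, y): abs(x - 3) + abs(y - 3)          # force grows away from an assumed easy corridor
    for x in range(8) for y in range(8)
}
OBSTACLES = {(2, 2), (2, 3), (3, 2)}          # cells where fragments would collide

def neighbors(cell):
    x, y = cell
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nxt = (x + dx, y + dy)
        if nxt in FORCE and nxt not in OBSTACLES:
            yield nxt

def heuristic(a, b):
    # Admissible Manhattan-distance heuristic on the grid.
    return abs(a[0] - b[0]) + abs(a[1] - b[1])

def a_star(start, goal, force_weight=1.0):
    """Return a collision-free path that trades off length against accumulated force."""
    frontier = [(heuristic(start, goal), 0.0, start, [start])]
    best_cost = {start: 0.0}
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path, cost
        for nxt in neighbors(cell):
            step = 1.0 + force_weight * FORCE[nxt]   # distance plus force penalty
            new_cost = cost + step
            if new_cost < best_cost.get(nxt, float("inf")):
                best_cost[nxt] = new_cost
                heapq.heappush(frontier,
                               (new_cost + heuristic(nxt, goal), new_cost, nxt, path + [nxt]))
    return None, float("inf")

path, cost = a_star((0, 0), (7, 7))
print(path, cost)
```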

  16. Underworld: What we set out to do, How far did we get, What did we Learn ? (Invited)

    NASA Astrophysics Data System (ADS)

    Moresi, L. N.

    2013-12-01

    Underworld was conceived as a tool for modelling 3D lithospheric deformation coupled with the underlying / surrounding mantle flow. The challenges involved were to find a method capable of representing the complicated, non-linear, history dependent rheology of the near surface as well as being able to model mantle convection, and, simultaneously, to be able to solve the numerical system efficiently. Underworld is a hybrid particle / mesh code reminiscent of the particle-in-cell techniques from the early 1960s. The Underworld team (*) was not the first to use this approach, nor the last, but the team does have considerable experience and much has been learned along the way. The use of a finite element method as the underlying "cell" in which the Lagrangian particles are embedded considerably reduces errors associated with mapping material properties to the cells. The particles are treated as moving quadrature points in computing the stiffness matrix integrals. The decoupling of deformation markers from computation points allows the use of structured meshes, efficient parallel decompositions, and simple-to-code geometric multigrid solution methods. For a 3D code such efficiencies are very important. The elegance of the method is that it can be completely described in a couple of sentences. However, there are some limitations: it is not obvious how to retain this elegance for unstructured or adaptive meshes, arbitrary element types are not sufficiently well integrated by the simple quadrature approach, and swarms of particles representing volumes are usually an inefficient representation of surfaces. This will be discussed ! (*) Although not formally constituted, my co-conspirators in this exercise are listed as the Underworld team and I will reveal their true identities on the day.

  17. Thin Cloud Detection Method by Linear Combination Model of Cloud Image

    NASA Astrophysics Data System (ADS)

    Liu, L.; Li, J.; Wang, Y.; Xiao, Y.; Zhang, W.; Zhang, S.

    2018-04-01

    Existing cloud detection methods in photogrammetry often extract image features directly from remote sensing images and then use them to classify pixels as cloud or non-cloud. When the cloud is thin and small, however, these methods become inaccurate. In this paper, a linear combination model of cloud images is proposed; using this model, the underlying surface information of remote sensing images can be removed, making the cloud detection result more accurate. First, the automatic cloud detection program uses the linear combination model to separate the cloud information from the surface information in semi-transparent cloud images, and then uses different image features to recognize the cloud parts. In consideration of computational efficiency, an AdaBoost classifier was introduced to combine the different features into a single cloud classifier. AdaBoost can select the most effective features from many candidate features, so the calculation time is greatly reduced. Finally, the proposed method was compared with a cloud detection method based on a tree structure and a multiple-feature detection method using an SVM classifier; the experimental data show that the proposed cloud detection program has high accuracy and fast calculation speed.
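    A minimal sketch of the feature-combination step: an AdaBoost classifier (scikit-learn) trained on hypothetical per-pixel feature vectors labelled cloud or clear. The synthetic features and their distributions are assumptions; in the paper the features are derived from the linear combination model applied to real imagery.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-pixel feature vectors (e.g., brightness, texture, band ratio)
# with binary labels: 1 = thin cloud, 0 = clear surface.
n = 2000
X_cloud = rng.normal(loc=[0.8, 0.3, 0.6], scale=0.15, size=(n // 2, 3))
X_clear = rng.normal(loc=[0.4, 0.5, 0.2], scale=0.15, size=(n // 2, 3))
X = np.vstack([X_cloud, X_clear])
y = np.hstack([np.ones(n // 2), np.zeros(n // 2)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# AdaBoost combines many weak learners, effectively weighting the most
# discriminative features, which is the role it plays in the detection pipeline.
clf = AdaBoostClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```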

  18. Exact and Metaheuristic Approaches for a Bi-Objective School Bus Scheduling Problem

    PubMed Central

    Chen, Xiaopan; Kong, Yunfeng; Dang, Lanxue; Hou, Yane; Ye, Xinyue

    2015-01-01

    As a class of hard combinatorial optimization problems, the school bus routing problem has received considerable attention in the last decades. For a multi-school system, given the bus trips for each school, the school bus scheduling problem aims at optimizing bus schedules to serve all the trips within the school time windows. In this paper, we propose two approaches for solving the bi-objective school bus scheduling problem: an exact method of mixed integer programming (MIP) and a metaheuristic method which combines simulated annealing with local search. We develop MIP formulations for homogenous and heterogeneous fleet problems respectively and solve the models by MIP solver CPLEX. The bus type-based formulation for heterogeneous fleet problem reduces the model complexity in terms of the number of decision variables and constraints. The metaheuristic method is a two-stage framework for minimizing the number of buses to be used as well as the total travel distance of buses. We evaluate the proposed MIP and the metaheuristic method on two benchmark datasets, showing that on both instances, our metaheuristic method significantly outperforms the respective state-of-the-art methods. PMID:26176764

  19. Distributed processing of a GPS receiver network for a regional ionosphere map

    NASA Astrophysics Data System (ADS)

    Choi, Kwang Ho; Hoo Lim, Joon; Yoo, Won Jae; Lee, Hyung Keun

    2018-01-01

    This paper proposes a distributed processing method applicable to GPS receivers in a network to generate a regional ionosphere map accurately and reliably. For accuracy, the proposed method is operated by multiple local Kalman filters and Kriging estimators. Each local Kalman filter is applied to a dual-frequency receiver to estimate the receiver’s differential code bias and vertical ionospheric delays (VIDs) at different ionospheric pierce points. The Kriging estimator selects and combines several VID estimates provided by the local Kalman filters to generate the VID estimate at each ionospheric grid point. For reliability, the proposed method uses receiver fault detectors and satellite fault detectors. Each receiver fault detector compares the VID estimates of the same local area provided by different local Kalman filters. Each satellite fault detector compares the VID estimate of each local area with that projected from the other local areas. Compared with the traditional centralized processing method, the proposed method is advantageous in that it considerably reduces the computational burden of each single Kalman filter and enables flexible fault detection, isolation, and reconfiguration capability. To evaluate the performance of the proposed method, several experiments with field collected measurements were performed.
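    The sketch below shows an ordinary-kriging estimate of the vertical ionospheric delay (VID) at one grid point from VID values at nearby pierce points, which is the role the Kriging estimator plays in the pipeline described above. The exponential variogram, its parameters, and the pierce-point values are illustrative assumptions, not the paper's tuned model.

```python
import numpy as np

def variogram(h, sill=1.0, corr_range=300.0):
    # Assumed exponential variogram for the VID field; parameters are illustrative.
    return sill * (1.0 - np.exp(-h / corr_range))

def ordinary_kriging(xy_obs, vid_obs, xy_grid):
    """Estimate the VID at one grid point from VIDs at nearby ionospheric pierce points."""
    n = len(xy_obs)
    d_obs = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
    d_to_grid = np.linalg.norm(xy_obs - xy_grid, axis=-1)

    # Ordinary-kriging system with a Lagrange multiplier enforcing unbiased weights.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d_obs)
    A[n, n] = 0.0
    b = np.append(variogram(d_to_grid), 1.0)
    w = np.linalg.solve(A, b)[:n]
    return float(w @ vid_obs)

# Hypothetical pierce-point coordinates (km) and VID estimates (m) from the local Kalman filters.
xy_obs = np.array([[0.0, 0.0], [150.0, 40.0], [80.0, 200.0], [220.0, 180.0]])
vid_obs = np.array([3.1, 2.8, 3.4, 2.6])
print("VID at grid point:", ordinary_kriging(xy_obs, vid_obs, np.array([120.0, 100.0])))
```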

  20. Upgrading and performance of the SAO laser ranging system in Matera

    NASA Technical Reports Server (NTRS)

    Maddox, J.; Pearlman, M.; Throp, J.; Wohn, J.

    1983-01-01

    The performance of the SAO lasers was improved considerably in terms of accuracy, range noise, data yield, and reliability. With the narrower laser pulse (2.5-3.0 nsec) and a new analog pulse processing system, the systematic range errors were reduced to 3-5 cm and range noise has been reduced to 5-15 cm on low satellites and 10-18 cm on Lageos. Pulse repetition rate was increased to 30 ppm and considerable improvement has been made in signal-to-noise ratio by using a 3 Angstrom interference filter and by reducing the range gate window down to 200-400 nsec. The first upgraded system was installed in Arequipa, Peru in the spring of 1982. The second upgraded system is now in operation in Matera, Italy. The third system is expected to be installed in Israel during 1984.

  1. A Numerical Study on the Screening of Blast-Induced Waves for Reducing Ground Vibration

    NASA Astrophysics Data System (ADS)

    Park, Dohyun; Jeon, Byungkyu; Jeon, Seokwon

    2009-06-01

    Blasting is often a necessary part of mining and construction operations, and is the most cost-effective way to break rock, but blasting generates both noise and ground vibration. In urban areas, noise and vibration have an environmental impact, and cause structural damage to nearby structures. Various wave-screening methods have been used for many years to reduce blast-induced ground vibration. However, these methods have not been quantitatively studied for their reduction effect of ground vibration. The present study focused on the quantitative assessment of the effectiveness in vibration reduction of line-drilling as a screening method using a numerical method. Two numerical methods were used to analyze the reduction effect toward ground vibration, namely, the “distinct element method” and the “non-linear hydrocode.” The distinct element method, by particle flow code in two dimensions (PFC 2D), was used for two-dimensional parametric analyses, and some cases of two-dimensional analyses were analyzed three-dimensionally using AUTODYN 3D, the program of the non-linear hydrocode. To analyze the screening effectiveness of line-drilling, parametric analyses were carried out under various conditions, with the spacing, diameter of drill holes, distance between the blasthole and line-drilling, and the number of rows of drill holes, including their arrangement, used as parameters. The screening effectiveness was assessed via a comparison of the vibration amplitude between cases both with and without screening. Also, the frequency distribution of ground motion of the two cases was investigated through fast Fourier transform (FFT), with the differences also examined. From our study, it was concluded that line-drilling as a screening method of blast-induced waves was considerably effective under certain design conditions. The design details for field application have also been proposed.
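    As a small worked example of the FFT-based comparison mentioned above, the sketch below computes one-sided amplitude spectra for two synthetic ground-vibration records, with and without screening, and reports the peak-frequency shift and peak-amplitude reduction. The waveforms and the assumed reduction are purely illustrative, not results of the PFC 2D or AUTODYN 3D simulations.

```python
import numpy as np

# Synthetic ground-vibration records (velocity time histories) with and without line-drilling.
fs = 2000.0                        # sampling rate, Hz
t = np.arange(0, 0.5, 1.0 / fs)
without_screen = np.exp(-40.0 * t) * np.sin(2 * np.pi * 120.0 * t)
with_screen = 0.6 * np.exp(-40.0 * t) * np.sin(2 * np.pi * 90.0 * t)   # assumed attenuated record

def spectrum(v):
    # One-sided amplitude spectrum via FFT, as used to compare frequency content.
    amp = np.abs(np.fft.rfft(v)) / len(v)
    freqs = np.fft.rfftfreq(len(v), d=1.0 / fs)
    return freqs, amp

f1, a1 = spectrum(without_screen)
f2, a2 = spectrum(with_screen)
print("peak frequency without screening: %.1f Hz" % f1[np.argmax(a1)])
print("peak frequency with screening:    %.1f Hz" % f2[np.argmax(a2)])
print("peak particle velocity reduction: %.0f %%" %
      (100 * (1 - with_screen.max() / without_screen.max())))
```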

  2. CB4-03: An Eye on the Future: A Review of Data Virtualization Techniques to Improve Research Analytics

    PubMed Central

    Richter, Jack; McFarland, Lela; Bredfeldt, Christine

    2012-01-01

    Background/Aims Integrating data across systems can be a daunting process. The traditional method of moving data to a common location, mapping fields with different formats and meanings, and performing data cleaning activities to ensure valid and reliable integration across systems can be both expensive and extremely time consuming. As the scope of needed research data increases, the traditional methodology may not be sustainable. Data Virtualization provides an alternative to traditional methods that may reduce the effort required to integrate data across disparate systems. Objective Our goal was to survey new methods in data integration, cloud computing, enterprise data management and virtual data management for opportunities to increase the efficiency of producing VDW and similar data sets. Methods Kaiser Permanente Information Technology (KPIT), in collaboration with the Mid-Atlantic Permanente Research Institute (MAPRI) reviewed methodologies in the burgeoning field of Data Virtualization. We identified potential strengths and weaknesses of new approaches to data integration. For each method, we evaluated its potential application for producing effective research data sets. Results Data Virtualization provides opportunities to reduce the amount of data movement required to integrate data sources on different platforms in order to produce research data sets. Additionally, Data Virtualization also includes methods for managing “fuzzy” matching used to match fields known to have poor reliability such as names, addresses and social security numbers. These methods could improve the efficiency of integrating state and federal data such as patient race, death, and tumors with internal electronic health record data. Discussion The emerging field of Data Virtualization has considerable potential for increasing the efficiency of producing research data sets. An important next step will be to develop a proof of concept project that will help us understand the benefits and drawbacks of these techniques.

  3. The effect of crash characteristics on cyclist injuries: An analysis of Virginia automobile-bicycle crash data.

    PubMed

    Robartes, Erin; Chen, T Donna

    2017-07-01

    This paper examines bicyclist, automobile driver, vehicle, environmental, and roadway characteristics that influence cyclist injury severity in order to determine which factors should be addressed to mitigate the worst bicyclist injuries. An ordered probit model is used to examine single bicycle-single vehicle crashes from Virginia police crash report data from 2010 to 2014. Five injury severity levels are considered: fatalities, severe injuries, minor or possible injuries, no apparent injuries, and no injury. The results of this study most notably found automobile driver intoxication to increase the probability of a cyclist fatality six fold and double the risk of a severe injury, while bicyclist intoxication increases the probability of a fatality by 36.7% and doubles the probability of severe injury. Additionally, bicycle and automobile speeds, obscured automobile driver vision, specific vehicle body types (SUV, truck, and van), vertical roadway grades and horizontal curves elevate the probability of more severe bicyclist injuries. Model results encourage consideration of methods to reduce the impact of biking and driving while intoxicated such as analysis of bicycling under the influence laws, education of drunk driving impacts on bicyclists, and separation of vehicles and bicycles on the road. Additionally, the results encourage consideration of methods to improve visibility of bicyclists and expectation of their presence on the road. Copyright © 2017 Elsevier Ltd. All rights reserved.
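    A minimal sketch of fitting an ordered probit model of injury severity, assuming statsmodels version 0.12 or later (which provides OrderedModel) is available. The covariates, the latent-variable construction, and the coefficients are synthetic illustrations, not the Virginia crash data or the paper's estimates.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel  # statsmodels >= 0.12

rng = np.random.default_rng(1)
n = 500

# Hypothetical crash covariates: driver intoxication, cyclist intoxication, vehicle speed (mph).
X = pd.DataFrame({
    "driver_intox": rng.integers(0, 2, n),
    "cyclist_intox": rng.integers(0, 2, n),
    "speed": rng.uniform(10, 60, n),
})

# Latent injury propensity plus noise, cut into five ordered severity levels
# (0 = no injury ... 4 = fatality); coefficients are illustrative only.
latent = 1.5 * X["driver_intox"] + 0.8 * X["cyclist_intox"] + 0.05 * X["speed"] + rng.normal(size=n)
codes = pd.cut(latent, bins=[-np.inf, 0.5, 1.5, 2.5, 3.5, np.inf], labels=False)
severity = pd.Series(pd.Categorical(codes, categories=range(5), ordered=True))

model = OrderedModel(severity, X, distr="probit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```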

  4. Analysis of hysteresis effect on the vibration motion of a bimodal non-uniform micro-cantilever using MCS theory

    NASA Astrophysics Data System (ADS)

    Korayem, M. H.; Korayem, A. H.; Hosseini Hashemi, Sh.

    2016-02-01

    Nowadays, to enhance the performance of atomic force microscopy (AFM) micro-cantilevers (MCs) during imaging, reduce costs and increase the surface topography precision, advanced MCs equipped with piezoelectric layers are utilized. Using the modified couple stress (MCS) theory not only makes the modeling more exhaustive, but also increases the accuracy of prediction of the vibration behavior of the system. In this paper, Hamilton's principle by consideration of the MCS theory has been used to extract the equations. In addition, to discretize the equations, differential quadrature method has been adopted. Analysis of the hysteresis effect on the vibration behavior of the AFM MC is of significant importance. Thus, to model the hysteresis effect, Bouc-Wen method, which is solved simultaneously with the vibration equations of non-uniform Timoshenko beam, has been utilized. Furthermore, a bimodal excitation of the MC has been considered. The results reveal that the hysteresis effect appears as a phase difference in the time response. Finally, the effect of the geometric parameters on the vibration frequency of the system which is excited by combination of the first two vibration modes of the non-uniform piezoelectric MC has been examined. The results indicate the considerable effect of the MC length in comparison with other geometric parameters such as the MC width and thickness.
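    The sketch below integrates the standard Bouc-Wen evolution equation, dz/dt = A*dx/dt - beta*|dx/dt|*|z|^(n-1)*z - gamma*(dx/dt)*|z|^n, for a prescribed sinusoidal displacement, which is the usual way the hysteresis variable is generated before being coupled to the beam equations. The parameter values and the excitation are illustrative assumptions, not the identified values for the piezoelectric micro-cantilever studied in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Bouc-Wen hysteresis variable z driven by a prescribed displacement x(t).
A, beta, gamma, n = 1.0, 0.5, 0.5, 1.0          # illustrative model parameters

def x(t):                                        # prescribed displacement (e.g., piezo excitation)
    return np.sin(2 * np.pi * t)

def xdot(t):
    return 2 * np.pi * np.cos(2 * np.pi * t)

def bouc_wen(t, z):
    v = xdot(t)
    return A * v - beta * abs(v) * abs(z[0]) ** (n - 1) * z[0] - gamma * v * abs(z[0]) ** n

sol = solve_ivp(bouc_wen, (0.0, 3.0), [0.0], dense_output=True, max_step=1e-3)
t = np.linspace(0.0, 3.0, 600)
z = sol.sol(t)[0]
# The hysteresis loop is the plot of z (restoring-force contribution) against x(t);
# its area reflects the phase lag seen in the time response.
print("z range:", z.min(), z.max())
```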

  5. Efficient Segmentation of a Breast in B-Mode Ultrasound Tomography Using Three-Dimensional GrabCut (GC3D)

    PubMed Central

    Wu, Shibin; Zhuang, Ling; Wei, Xinhua; Sak, Mark; Neb, Duric; Hu, Jiani; Xie, Yaoqin

    2017-01-01

    As an emerging modality for whole breast imaging, ultrasound tomography (UST), has been adopted for diagnostic purposes. Efficient segmentation of an entire breast in UST images plays an important role in quantitative tissue analysis and cancer diagnosis, while major existing methods suffer from considerable time consumption and intensive user interaction. This paper explores three-dimensional GrabCut (GC3D) for breast isolation in thirty reflection (B-mode) UST volumetric images. The algorithm can be conveniently initialized by localizing points to form a polygon, which covers the potential breast region. Moreover, two other variations of GrabCut and an active contour method were compared. Algorithm performance was evaluated from volume overlap ratios (TO, target overlap; MO, mean overlap; FP, false positive; FN, false negative) and time consumption. Experimental results indicate that GC3D considerably reduced the work load and achieved good performance (TO = 0.84; MO = 0.91; FP = 0.006; FN = 0.16) within an average of 1.2 min per volume. Furthermore, GC3D is not only user friendly, but also robust to various inputs, suggesting its great potential to facilitate clinical applications during whole-breast UST imaging. In the near future, the implemented GC3D can be easily automated to tackle B-mode UST volumetric images acquired from the updated imaging system. PMID:28786946
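    A small sketch of the volume-overlap evaluation quoted above (TO, MO, FP, FN) computed from binary masks; here mean overlap is taken as the Dice coefficient and the other ratios follow the usual definitions, which may differ in detail from the paper's exact formulas. The toy volumes stand in for a GC3D segmentation and the manual reference.

```python
import numpy as np

def overlap_metrics(seg, ref):
    """Volume-overlap metrics between a binary segmentation and a reference mask."""
    seg, ref = seg.astype(bool), ref.astype(bool)
    inter = np.logical_and(seg, ref).sum()
    to = inter / ref.sum()                                    # target overlap
    mo = 2.0 * inter / (seg.sum() + ref.sum())                # mean (Dice) overlap
    fp = np.logical_and(seg, ~ref).sum() / seg.sum()          # false positive fraction
    fn = np.logical_and(~seg, ref).sum() / ref.sum()          # false negative fraction
    return to, mo, fp, fn

# Toy 3D volumes standing in for a GC3D result and the manual reference.
ref = np.zeros((20, 20, 20), dtype=bool)
ref[5:15, 5:15, 5:15] = True
seg = np.zeros_like(ref)
seg[6:16, 5:15, 5:15] = True
print("TO=%.2f MO=%.2f FP=%.3f FN=%.2f" % overlap_metrics(seg, ref))
```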

  6. Transnasal endoscopy: Technical considerations, advantages and limitations.

    PubMed

    Atar, Mustafa; Kadayifci, Abdurrahman

    2014-02-16

    Transnasal endoscopy (TNE) is an upper endoscopy method which is performed by the nasal route using a thin endoscope less than 6 mm in diameter. The primary goal of this method is to improve patient tolerance and convenience of the procedure. TNE can be performed without sedation and thus eliminates the risks associated with general anesthesia. In this way, TNE decreases the cost and total duration of endoscopic procedures, while maintaining the image quality of standard caliber endoscopes, providing good results for diagnostic purposes. However, the small working channel of the ultra-thin endoscope used for TNE makes it difficult to use for therapeutic procedures except in certain conditions which require a thinner endoscope. Biopsy is possible with special forceps less than 2 mm in diameter. Recently, TNE has been used for screening endoscopy in Far East Asia, including Japan. In most controlled studies, TNE was found to have better patient tolerance when compared to unsedated endoscopy. Nasal pain is the most significant symptom associated with endoscopic procedures but can be reduced with nasal pretreatment. Despite the potential advantage of TNE, it is not common in Western countries, usually due to a lack of training in the technique and a lack of awareness of its potential advantages. This paper briefly reviews the technical considerations as well as the potential advantages and limitations of TNE with ultra-thin scopes.

  7. Transnasal endoscopy: Technical considerations, advantages and limitations

    PubMed Central

    Atar, Mustafa; Kadayifci, Abdurrahman

    2014-01-01

    Transnasal endoscopy (TNE) is an upper endoscopy method which is performed by the nasal route using a thin endoscope less than 6 mm in diameter. The primary goal of this method is to improve patient tolerance and convenience of the procedure. TNE can be performed without sedation and thus eliminates the risks associated with general anesthesia. In this way, TNE decreases the cost and total duration of endoscopic procedures, while maintaining the image quality of standard caliber endoscopes, providing good results for diagnostic purposes. However, the small working channel of the ultra-thin endoscope used for TNE makes it difficult to use for therapeutic procedures except in certain conditions which require a thinner endoscope. Biopsy is possible with special forceps less than 2 mm in diameter. Recently, TNE has been used for screening endoscopy in Far East Asia, including Japan. In most controlled studies, TNE was found to have better patient tolerance when compared to unsedated endoscopy. Nasal pain is the most significant symptom associated with endoscopic procedures but can be reduced with nasal pretreatment. Despite the potential advantage of TNE, it is not common in Western countries, usually due to a lack of training in the technique and a lack of awareness of its potential advantages. This paper briefly reviews the technical considerations as well as the potential advantages and limitations of TNE with ultra-thin scopes. PMID:24567791

  8. At-Risk Youth: Considerations for State-Level Policymakers, Including a Summary of Recent State-Level Actions in the Region.

    ERIC Educational Resources Information Center

    Regional Laboratory for Educational Improvement of the Northeast & Islands, Andover, MA.

    In response to the growing dropout problem, most of the northeastern states (and islands) and many of their local districts have implemented policies and programs designed to reduce the number of students leaving school. This paper presents some observations, culled from various reports, for policymakers' consideration. Section I offers a set of…

  9. Unsupervised Spatio-Temporal Data Mining Framework for Burned Area Mapping

    NASA Technical Reports Server (NTRS)

    Kumar, Vipin (Inventor); Boriah, Shyam (Inventor); Mithal, Varun (Inventor); Khandelwal, Ankush (Inventor)

    2016-01-01

    A method reduces processing time required to identify locations burned by fire by receiving a feature value for each pixel in an image, each pixel representing a sub-area of a location. Pixels are then grouped based on similarities of the feature values to form candidate burn events. For each candidate burn event, a probability that the candidate burn event is a true burn event is determined based on at least one further feature value for each pixel in the candidate burn event. Candidate burn events that have a probability below a threshold are removed from further consideration as burn events to produce a set of remaining candidate burn events.
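    A toy sketch of the two-stage idea in the claim above: pixels are first grouped into candidate burn events from one feature, then each candidate is scored with a further feature and discarded if the score falls below a threshold. The connected-component grouping, the score, and the synthetic features are illustrative stand-ins for the patented grouping and probability models.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)

# Hypothetical per-pixel feature (e.g., a drop in a vegetation index) on a small image.
drop = rng.normal(0.05, 0.02, size=(6, 6))
drop[1:3, 1:4] = rng.normal(0.40, 0.05, size=(2, 3))      # a true burn patch

# Step 1: group similar pixels into candidate burn events (thresholding + labelling here).
candidate_mask = drop > 0.2
labels, n_events = ndimage.label(candidate_mask)

# Step 2: score each candidate from a further feature and keep only likely burns.
post_fire_signal = rng.normal(0.5, 0.1, size=drop.shape)  # hypothetical second feature
threshold = 0.4
for event in range(1, n_events + 1):
    score = post_fire_signal[labels == event].mean()      # stand-in for the event probability
    verdict = "keep" if score >= threshold else "discard"
    print(f"candidate {event}: score={score:.2f} -> {verdict}")
```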

  10. Inverse finite-size scaling for high-dimensional significance analysis

    NASA Astrophysics Data System (ADS)

    Xu, Yingying; Puranen, Santeri; Corander, Jukka; Kabashima, Yoshiyuki

    2018-06-01

    We propose an efficient procedure for significance determination in high-dimensional dependence learning based on surrogate data testing, termed inverse finite-size scaling (IFSS). The IFSS method is based on our discovery of a universal scaling property of random matrices which enables inference about signal behavior from much smaller scale surrogate data than the dimensionality of the original data. As a motivating example, we demonstrate the procedure for ultra-high-dimensional Potts models with on the order of 10^10 parameters. IFSS reduces the computational effort of the data-testing procedure by several orders of magnitude, making it very efficient for practical purposes. This approach thus holds considerable potential for generalization to other types of complex models.

  11. Deep learning beyond Lefschetz thimbles

    NASA Astrophysics Data System (ADS)

    Alexandru, Andrei; Bedaque, Paulo F.; Lamm, Henry; Lawrence, Scott

    2017-11-01

    The generalized thimble method to treat field theories with sign problems requires repeatedly solving the computationally expensive holomorphic flow equations. We present a machine learning technique to bypass this problem. The central idea is to obtain a few field configurations via the flow equations to train a feed-forward neural network. The trained network defines a new manifold of integration which reduces the sign problem and can be rapidly sampled. We present results for the 1+1 dimensional Thirring model with Wilson fermions on sizable lattices. In addition to the gain in speed, the parametrization of the integration manifold we use avoids the "trapping" of Monte Carlo chains which plagues large-flow calculations, a considerable shortcoming of the previous attempts.

  12. Optimal control applied to a model for species augmentation.

    PubMed

    Bodine, Erin N; Gross, Louis J; Lenhart, Suzanne

    2008-10-01

    Species augmentation is a method of reducing species loss via augmenting declining or threatened populations with individuals from captive-bred or stable, wild populations. In this paper, we develop a differential equations model and optimal control formulation for a continuous time augmentation of a general declining population. We find a characterization for the optimal control and show numerical results for scenarios of different illustrative parameter sets. The numerical results provide considerably more detail about the exact dynamics of optimal augmentation than can be readily intuited. The work and results presented in this paper are a first step toward building a general theory of population augmentation, which accounts for the complexities inherent in many conservation biology applications.
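    As a rough sketch of the modelling setup, the code below simulates a declining target population coupled to a stable source population with a piecewise-constant augmentation rate, using SciPy's ODE solver. The logistic form, parameter values, and the fixed control policy are assumptions for illustration; the paper derives an optimal time-varying control rather than prescribing one.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Declining target population x augmented at rate u(t) from a stable source population y.
r_x, K_x = -0.05, 100.0       # target: negative intrinsic growth (declining)
r_y, K_y = 0.20, 500.0        # source: stable, positive growth

def u(t):
    return 5.0 if t < 4.0 else 0.0     # simple piecewise-constant augmentation policy

def rhs(t, state):
    x, y = state
    dx = r_x * x * (1 - x / K_x) + u(t)
    dy = r_y * y * (1 - y / K_y) - u(t)
    return [dx, dy]

sol = solve_ivp(rhs, (0.0, 10.0), [20.0, 400.0], dense_output=True)
t = np.linspace(0, 10, 11)
x, y = sol.sol(t)
print("target population over time:", np.round(x, 1))
```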

  13. Nanoscale materials for hyperthermal theranostics

    NASA Astrophysics Data System (ADS)

    Smith, Bennett E.; Roder, Paden B.; Zhou, Xuezhe; Pauzauskie, Peter J.

    2015-04-01

    Recently, the use of nanoscale materials has attracted considerable attention with the aim of designing personalized therapeutic approaches that can enhance both spatial and temporal control over drug release, permeability, and uptake. Potential benefits to patients include the reduction of overall drug dosages, enabling the parallel delivery of different pharmaceuticals, and the possibility of enabling additional functionalities such as hyperthermia or deep-tissue imaging (LIF, PET, etc.) that complement and extend the efficacy of traditional chemotherapy and surgery. This mini-review is focused on an emerging class of nanometer-scale materials that can be used both to heat malignant tissue to reduce angiogenesis and DNA-repair while simultaneously offering complementary imaging capabilities based on radioemission, optical fluorescence, magnetic resonance, and photoacoustic methods.

  14. The relation between rice consumption, arsenic contamination, and prevalence of diabetes in South Asia

    PubMed Central

    Hassan, Fatima Ismail; Niaz, Kamal; Khan, Fazlullah; Maqbool, Faheem; Abdollahi, Mohammad

    2017-01-01

    Rice is the major staple food for about two billion people living in Asia. It has been reported to contain a considerable amount of inorganic arsenic, which is toxic to pancreatic beta cells and disrupts glucose homeostasis. Articles and conference papers published between 1992 and 2017, indexed in Scopus, PubMed, EMBASE, Google, and Google Scholar, were used. Arsenic exposure has been associated with increased blood glucose and insulin levels, or decreased sensitization of insulin cells to glucose uptake. Several studies have shown an association between inorganic arsenic exposure and the incidence of diabetes mellitus. Considerable amounts of arsenic have been reported in different types of rice, which may be affected by cultivation methods, processing, and country of production. Use of certain microbes, fertilizers, and enzymes may reduce arsenic uptake or accumulation in rice, which may reduce its risk of toxicity. Combined exposure to contaminated rice, other foods and drinking water may increase the risk of diabetes in these countries. A maximum tolerated daily intake of arsenic from contaminated rice (2.1 µg/kg body weight per day) has been set by the WHO, which may be exceeded depending on the arsenic content of the rice and the amount consumed. Hence, the increased prevalence of diabetes in South Asia may be related to the consumption of arsenic contaminated rice, depending on its arsenic content and the daily amount consumed. In this review, we have focused on the possible relation between rice consumption, arsenic contamination, and prevalence of diabetes in South Asia. PMID:29285009

  15. A proposed new handbook for the Federal Emergency Management Agency: Radiation safety in shelters

    NASA Astrophysics Data System (ADS)

    Haaland, C. M.

    1981-09-01

    A proposed replacement for the portion of the current Handbook for Radiological Monitoring that deals with protection of people in shelters from radiation from fallout resulting from nuclear war is presented. Basic information at a high school level is given on how to detect nuclear radiation, how to find and improve the safest places in a shelter, the necessity for and how to keep records on individual radiation exposures, and how to minimize exposures. Several procedures are introduced, some of which are based more on theoretical considerations than on actual experiments. These procedures include: (1) the method of time averaging radiation readings taken with one instrument in different locations of a large shelter while fallout is coming down and radiation levels are climbing too rapidly for direct comparison of readings to determine the safest location; (2) the method of using one's own body to obtain directionality in radiation readings taken with a standard Civil Defense survey meter; (3) the method of using mutual shielding to reduce the average radiation exposure to shelter occupants; and (4) the ratio method for estimating radiation levels in hazardous areas.

  16. Computer analysis of ATR-FTIR spectra of paint samples for forensic purposes

    NASA Astrophysics Data System (ADS)

    Szafarska, Małgorzata; Woźniakiewicz, Michał; Pilch, Mariusz; Zięba-Palus, Janina; Kościelniak, Paweł

    2009-04-01

    A method of subtraction and normalization of IR spectra (MSN-IR) was developed and successfully applied to extract mathematically the pure paint spectrum from the spectrum of a paint coat on different bases, both acquired by the Attenuated Total Reflectance Fourier Transform Infrared (ATR-FTIR) technique. The method consists of several stages encompassing several normalization and subtraction processes. The similarity of the spectrum obtained with the reference spectrum was estimated by means of the normalized Manhattan distance. The utility and performance of the proposed method were tested by examination of five different paints sprayed on plastic (polyester) foil and on fabric materials (cotton). It was found that the numerical algorithm applied is able - in contrast to other mathematical approaches conventionally used for the same aim - to reconstruct a pure paint IR spectrum effectively without a loss of the chemical information provided. The approach allows the physical separation of a paint from a base to be avoided, hence the time and workload of analysis are considerably reduced. The results obtained prove that the method can be considered a useful tool for forensic purposes.
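    The sketch below mimics the spirit of the MSN-IR idea on synthetic spectra: normalize, subtract a scaled base spectrum from the paint-on-base spectrum, and score similarity with a normalized Manhattan distance. The Gaussian band shapes, the scaling factor, and the exact normalization and distance definitions are assumptions and will differ from the published algorithm's stages.

```python
import numpy as np

def normalize(spectrum):
    # Scale a spectrum to unit maximum absorbance (one possible normalization choice).
    return spectrum / spectrum.max()

def extract_paint(mixed, base, alpha=1.0):
    # Subtract the (scaled) base spectrum from the paint-on-base spectrum.
    return np.clip(normalize(mixed) - alpha * normalize(base), 0.0, None)

def normalized_manhattan(a, b):
    # Normalized Manhattan (city-block) distance used as the similarity score.
    a, b = normalize(a), normalize(b)
    return np.abs(a - b).sum() / len(a)

# Synthetic spectra: a "pure paint" reference, a "base" (foil), and their mixture.
wn = np.linspace(600, 4000, 1700)                      # wavenumbers, cm^-1
paint_ref = np.exp(-((wn - 1730) / 30) ** 2) + 0.6 * np.exp(-((wn - 2900) / 60) ** 2)
base = 0.8 * np.exp(-((wn - 1250) / 80) ** 2)
mixed = 0.7 * paint_ref + 0.5 * base

recovered = extract_paint(mixed, base, alpha=0.5)
print("distance mixed vs reference:     %.4f" % normalized_manhattan(mixed, paint_ref))
print("distance recovered vs reference: %.4f" % normalized_manhattan(recovered, paint_ref))
```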

  17. Mouse Embryo Cryopreservation by Rapid Cooling.

    PubMed

    Shaw, Jillian

    2018-05-01

    Embryo cryopreservation has been used to archive mouse strains. Protocols have evolved over this time and now vary considerably in terms of cryoprotectant solution, cooling and warming rates, methods to add and remove cryoprotectant, container or carrier type, volume of cryoprotectant, the stage of preimplantation development, and the use of additional treatments such as blastocyst puncture and microinjection. The rapid cooling methods use concentrated solutions of cryoprotectants to reduce the water content of the cell before cooling commences, thus preventing the formation of ice crystals. Embryos are equilibrated with the cryoprotectants, loaded into a carrier, and then rapidly cooled (e.g., by being plunged directly into LN 2 or onto a surface cooled in LN 2 ). The rapid cooling methods eliminate the need for controlled-rate freezers and seeding procedures. However, they are much more sensitive to minor variations when performing the steps. The rapid-cooling protocol described here is suitable for use with plastic insemination straws. Because it uses relatively large volumes, it is less technically demanding than some other methods that use minivolume devices. © 2018 Cold Spring Harbor Laboratory Press.

  18. Commissioning of full energy scanning irradiation with carbon-ion beams ranging from 55.6 to 430 MeV/u at the NIRS-HIMAC

    NASA Astrophysics Data System (ADS)

    Hara, Y.; Furukawa, T.; Mizushima, K.; Inaniwa, T.; Saotome, N.; Tansho, R.; Saraya, Y.; Shirai, T.; Noda, K.

    2017-09-01

    Since 2011, a three-dimensional (3D) scanning irradiation system has been utilized for treatments at the National Institute of Radiological Sciences-Heavy Ion Medical Accelerator in Chiba (NIRS-HIMAC). In 2012, a hybrid depth scanning method was introduced for the depth direction, in which 11 discrete beam energies are used in conjunction with the range shifter. To suppress beam spread due to multiple scattering and nuclear reactions, we then developed a full energy scanning method. Accelerator tuning and beam commissioning tests prior to a treatment with this method are time-consuming, however. We therefore devised a new approach to obtain the pencil beam dataset, including consideration of the contribution of large-angle scattered (LAS) particles, which reduces the time spent on beam data preparation. The accuracy of 3D dose delivery using this new approach was verified by measuring the dose distributions for different target volumes. Results confirmed that the measured dose distributions agreed well with calculated doses. Following this evaluation, treatments using the full energy scanning method were commenced in September 2015.

  19. Numerical Procedures for Inlet/Diffuser/Nozzle Flows

    NASA Technical Reports Server (NTRS)

    Rubin, Stanley G.

    1998-01-01

    Two primitive variable, pressure based, flux-split, RNS/NS solution procedures for viscous flows are presented. Both methods are uniformly valid across the full Mach number range, i.e., from the incompressible limit to high supersonic speeds. The first method is an 'optimized' version of a previously developed global pressure relaxation RNS procedure. Considerable reduction in the number of relatively expensive matrix inversions, and thereby in the computational time, has been achieved with this procedure. CPU times are reduced by a factor of 15 for predominantly elliptic flows (incompressible and low subsonic). The second method is a time-marching, 'linearized' convection RNS/NS procedure. The key to the efficiency of this procedure is the reduction to a single LU inversion at the inflow cross-plane. The remainder of the algorithm simply requires back-substitution with this LU and the corresponding residual vector at any cross-plane location. This method is not time-consistent, but has a convective-type CFL stability limitation. Both formulations are robust and provide accurate solutions for a variety of internal viscous flows, as shown herein.
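    The efficiency argument for the second procedure, factor once at the inflow cross-plane and back-substitute for every downstream plane, can be illustrated with a generic dense-linear-algebra sketch using SciPy's LU routines. The random matrix and right-hand sides are stand-ins for the actual discretized flow operator and residual vectors.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(3)
n = 200
A_inflow = rng.normal(size=(n, n)) + n * np.eye(n)     # stand-in for the inflow-plane operator

lu, piv = lu_factor(A_inflow)                          # expensive step, done once

solutions = []
for plane in range(50):                                # marching through cross-planes
    residual = rng.normal(size=n)                      # plane-dependent right-hand side
    solutions.append(lu_solve((lu, piv), residual))    # cheap back-substitution per plane

print("solved", len(solutions), "cross-planes with one factorization")
```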

  20. High-Throughput Fabrication Method for Producing a Silver-Nanoparticles-Doped Nanoclay Polymer Composite with Novel Synergistic Antibacterial Effects at the Material Interface.

    PubMed

    Cai, Shaobo; Pourdeyhimi, Behnam; Loboa, Elizabeth G

    2017-06-28

    In this study, we report a high-throughput fabrication method at industrial pilot scale to produce a silver-nanoparticles-doped nanoclay-polylactic acid composite with a novel synergistic antibacterial effect. The obtained nanocomposite has a significantly lower affinity for bacterial adhesion, allowing the loading amount of silver nanoparticles to be tremendously reduced while maintaining satisfactory antibacterial efficacy at the material interface. This is a great advantage for many antibacterial applications in which cost is a consideration. Furthermore, unlike previously reported methods that require additional chemical reduction processes to produce the silver-nanoparticles-doped nanoclay, an in situ preparation method was developed in which silver nanoparticles were created simultaneously during the composite fabrication process by thermal reduction. This is the first report to show that altered material surface submicron structures created with the loading of nanoclay enables the creation of a nanocomposite with significantly lower affinity for bacterial adhesion. This study provides a promising scalable approach to produce antibacterial polymeric products with minimal changes to industry standard equipment, fabrication processes, or raw material input cost.

  1. Sterility method of pest control and its potential role in an integrated sea lamprey (Petromyzon marinus) control program

    USGS Publications Warehouse

    Hanson, Lee H.; Manion, Patrick J.

    1980-01-01

    The sterility method of pest control could be an effective tool in the sea lamprey (Petromyzon marinus) control program in the Great Lakes. Some of the requirements for its successful application have been met. A field study demonstrated that the release of male sea lampreys, sterilized by the injection of 100 mg/kg of P,P-bis(1-aziridinyl)-N-methylphosphinothioic amide (bisazir), will reduce the number of viable larvae produced. The actual reduction in reproductive success that occurred was directly related to the ratio of sterile to normal males in the population. The technique can be used in many ways in an integrated control program and has considerable potential for the more effective control of the sea lamprey. Eradication is a distinct possibility.Key words: sea lamprey, Petromyzon marinus; pest control, fish control, sterile-male technique, sterilization, chemosterilants, bisazir, Great Lakes

  2. Evaluation of Bending Strength of Carburized Gears Based on Inferential Identification of Principal Surface Layer Defects

    NASA Astrophysics Data System (ADS)

    Masuyama, Tomoya; Inoue, Katsumi; Yamanaka, Masashi; Kitamura, Kenichi; Saito, Tomoyuki

    High load capacity of carburized gears originates mainly from the hardened layer and induced residual stress. On the other hand, surface decarburization, which causes a nonmartensitic layer, and inclusions such as oxides and segregation act as latent defects which considerably reduce fatigue strength. In this connection, the authors have proposed a formula of strength evaluation by separately quantifying defect influence. However, the principal defect which limits strength of gears with several different defects remains unclarified. This study presents a method of inferential identification of principal defects based on test results of carburized gears made of SCM420 clean steel, gears with both an artificial notch and nonmartensitic layer at the tooth fillet, and so forth. It clarifies practical uses of presented methods, and strength of carburized gears can be evaluated by focusing on principal defect size.

  3. Computational Reduction of Specimen Noise to Enable Improved Thermography Characterization of Flaws in Graphite Polymer Composites

    NASA Technical Reports Server (NTRS)

    Winfree, William P.; Howell, Patricia A.; Zalameda, Joseph N.

    2014-01-01

    Flaw detection and characterization with thermographic techniques in graphite polymer composites are often limited by localized variations in the thermographic response. Variations in properties such as acceptable porosity, fiber volume content and surface polymer thickness result in variations in the thermal response that in general cause significant variations in the initial thermal response. These result in a "noise" floor that increases the difficulty of detecting and characterizing deeper flaws. A method is presented for computationally removing a significant amount of the "noise" from near surface porosity by diffusing the early time response, then subtracting it from subsequent responses. Simulations of the thermal response of a composite are utilized in defining the limitations of the technique. This method for reducing the data is shown to give considerable improvement characterizing both the size and depth of damage. Examples are shown for data acquired on specimens with fabricated delaminations and impact damage.
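    A sketch of the correction step on synthetic frames: the early-time response, dominated by near-surface porosity, is numerically diffused (Gaussian blur), scaled, and subtracted from the later frame so that the deep-flaw signature stands out from the porosity "noise" floor. The decay factor, blur width, and amplitudes are illustrative assumptions rather than values from the paper's simulations.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(4)

# Synthetic frames: near-surface porosity shows up sharply at early times and, by the
# later time, has diffused laterally and lost amplitude; a deep flaw only appears late.
porosity = gaussian_filter(rng.normal(size=(128, 128)), sigma=3.0)
porosity *= 0.05 / porosity.std()                        # spatially correlated pattern
deep_flaw = np.zeros((128, 128))
deep_flaw[60:70, 60:70] = 0.04
sensor_noise = rng.normal(0.0, 0.002, size=(128, 128))

early_frame = porosity
late_frame = 0.4 * gaussian_filter(porosity, sigma=2.0) + deep_flaw + sensor_noise

# Computationally diffuse the early frame, scale it, and subtract it from the later frame.
corrected = late_frame - 0.4 * gaussian_filter(early_frame, sigma=2.0)

cnr_before = deep_flaw.max() / late_frame[deep_flaw == 0].std()
cnr_after = deep_flaw.max() / corrected[deep_flaw == 0].std()
print("flaw contrast-to-noise before: %.1f  after: %.1f" % (cnr_before, cnr_after))
```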

  4. The evolution of image-guided lumbosacral spine surgery.

    PubMed

    Bourgeois, Austin C; Faulkner, Austin R; Pasciak, Alexander S; Bradley, Yong C

    2015-04-01

    Techniques and approaches of spinal fusion have considerably evolved since their first description in the early 1900s. The incorporation of pedicle screw constructs into lumbosacral spine surgery is among the most significant advances in the field, offering immediate stability and decreased rates of pseudarthrosis compared to previously described methods. However, early studies describing pedicle screw fixation and numerous studies thereafter have demonstrated clinically significant sequelae of inaccurate surgical fusion hardware placement. A number of image guidance systems have been developed to reduce morbidity from hardware malposition in increasingly complex spine surgeries. Advanced image guidance systems such as intraoperative stereotaxis improve the accuracy of pedicle screw placement using a variety of surgical approaches, however their clinical indications and clinical impact remain debated. Beginning with intraoperative fluoroscopy, this article describes the evolution of image guided lumbosacral spinal fusion, emphasizing two-dimensional (2D) and three-dimensional (3D) navigational methods.

  5. Structural Optimization of a Knuckle with Consideration of Stiffness and Durability Requirements

    PubMed Central

    Kim, Geun-Yeon

    2014-01-01

    The automobile's knuckle is connected to the parts of the steering system and the suspension system and is used for adjusting the direction of rotation through its attachment to the wheel. This study changes the existing material from GCD450 to Al6082M and derives a lightweight design of the knuckle, suitable for small cars, using an optimal design technique. Six shape design variables were selected for the optimization of the knuckle, and criteria relevant to stiffness and durability were considered as the design requirements during the optimization process. The metamodel-based optimization method that uses the kriging interpolation method was applied as the optimization technique. The result shows that all constraints for stiffness and durability are satisfied using Al6082M, while the weight of the knuckle is reduced by 60% compared to that of the existing GCD450. PMID:24995359
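    A minimal sketch of the metamodel-based step: a kriging (Gaussian-process) surrogate is fitted to a handful of sampled designs and then queried cheaply over a candidate grid in place of the expensive finite-element analysis. The two-variable mass function, the sampling plan, and the kernel are illustrative assumptions; the actual study uses six shape variables with stiffness and durability constraints.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(5)

# Hypothetical stand-in for the expensive FE analysis: mass as a function of two
# shape variables (e.g., rib thicknesses, in mm).
def fe_mass(x):
    return 2.0 + 0.08 * x[:, 0] + 0.05 * x[:, 1] + 0.02 * x[:, 0] * x[:, 1]

X_train = rng.uniform(2.0, 10.0, size=(20, 2))          # sampled designs (DOE)
y_train = fe_mass(X_train)

# Kriging (Gaussian-process) metamodel fitted to the sampled designs.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=2.0),
                              normalize_y=True)
gp.fit(X_train, y_train)

# Cheap surrogate evaluations over a candidate grid; an optimizer would search this
# surface (subject to constraint metamodels) instead of rerunning the FE model.
grid = np.array([[a, b] for a in np.linspace(2, 10, 25) for b in np.linspace(2, 10, 25)])
pred, std = gp.predict(grid, return_std=True)
best = np.argmin(pred)
print("lightest predicted design:", grid[best], "mass ~", pred[best], "+/-", std[best])
```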

  6. A partially reflecting random walk on spheres algorithm for electrical impedance tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maire, Sylvain, E-mail: maire@univ-tln.fr; Simon, Martin, E-mail: simon@math.uni-mainz.de

    2015-12-15

    In this work, we develop a probabilistic estimator for the voltage-to-current map arising in electrical impedance tomography. This novel so-called partially reflecting random walk on spheres estimator enables Monte Carlo methods to compute the voltage-to-current map in an embarrassingly parallel manner, which is an important issue with regard to the corresponding inverse problem. Our method uses the well-known random walk on spheres algorithm inside subdomains where the diffusion coefficient is constant and employs replacement techniques motivated by finite difference discretization to deal with both mixed boundary conditions and interface transmission conditions. We analyze the global bias and the variance of the new estimator both theoretically and experimentally. Subsequently, the variance of the new estimator is considerably reduced via a novel control variate conditional sampling technique which yields a highly efficient hybrid forward solver coupling probabilistic and deterministic algorithms.
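    For orientation, the sketch below implements the classical (fully absorbing) walk-on-spheres estimator for a Laplace problem on the unit square, the building block that the paper extends with partial reflection, interface handling, and control variates. The boundary data is chosen so the exact solution is known (u = x), which makes the Monte Carlo estimate easy to check.

```python
import numpy as np

rng = np.random.default_rng(6)

def boundary_value(p):
    # Dirichlet data on the unit-square boundary (illustrative choice): u = x on the boundary.
    return p[0]

def walk_on_spheres(p0, eps=1e-3, max_steps=10_000):
    """One walk-on-spheres sample of the harmonic function u at p0 in the unit square."""
    p = np.array(p0, dtype=float)
    for _ in range(max_steps):
        # Largest sphere centred at p that stays inside the square.
        r = min(p[0], 1 - p[0], p[1], 1 - p[1])
        if r < eps:                       # close enough to the boundary: read off the data
            return boundary_value(p)
        theta = rng.uniform(0.0, 2 * np.pi)
        p += r * np.array([np.cos(theta), np.sin(theta)])
    return boundary_value(p)

# Monte Carlo estimate of u(0.3, 0.7); for u = x the exact value is 0.3.
samples = [walk_on_spheres((0.3, 0.7)) for _ in range(2000)]
print("estimate:", np.mean(samples), "+/-", np.std(samples) / np.sqrt(len(samples)))
```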

  7. Computational reduction of specimen noise to enable improved thermography characterization of flaws in graphite polymer composites

    NASA Astrophysics Data System (ADS)

    Winfree, William P.; Howell, Patricia A.; Zalameda, Joseph N.

    2014-05-01

    Flaw detection and characterization with thermographic techniques in graphite polymer composites are often limited by localized variations in the thermographic response. Variations in properties such as acceptable porosity, fiber volume content and surface polymer thickness result in variations in the thermal response that in general cause significant variations in the initial thermal response. These result in a "noise" floor that increases the difficulty of detecting and characterizing deeper flaws. A method is presented for computationally removing a significant amount of the "noise" from near surface porosity by diffusing the early time response, then subtracting it from subsequent responses. Simulations of the thermal response of a composite are utilized in defining the limitations of the technique. This method for reducing the data is shown to give considerable improvement characterizing both the size and depth of damage. Examples are shown for data acquired on specimens with fabricated delaminations and impact damage.

  8. Performance tradeoffs in static and dynamic load balancing strategies

    NASA Technical Reports Server (NTRS)

    Iqbal, M. A.; Saltz, J. H.; Bokhart, S. H.

    1986-01-01

    The problem of uniformly distributing the load of a parallel program over a multiprocessor system was considered. A program was analyzed whose structure permits the computation of the optimal static solution. Then four strategies for load balancing were described and their performance compared. The strategies are: (1) the optimal static assignment algorithm which is guaranteed to yield the best static solution, (2) the static binary dissection method which is very fast but sub-optimal, (3) the greedy algorithm, a static fully polynomial time approximation scheme, which estimates the optimal solution to arbitrary accuracy, and (4) the predictive dynamic load balancing heuristic which uses information on the precedence relationships within the program and outperforms any of the static methods. It is also shown that the overhead incurred by the dynamic heuristic is reduced considerably if it is started off with a static assignment provided by either of the other three strategies.
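    As a concrete example of a static greedy assignment, the sketch below uses the common longest-processing-time heuristic: tasks are sorted by cost and each is placed on the currently least-loaded processor. The per-task costs are hypothetical, and the paper's greedy approximation scheme may differ in detail from this heuristic.

```python
import heapq

def greedy_assign(task_costs, n_processors):
    """Longest-processing-time greedy: assign each task to the currently least-loaded processor."""
    loads = [(0.0, p) for p in range(n_processors)]      # (current load, processor id)
    heapq.heapify(loads)
    assignment = {p: [] for p in range(n_processors)}
    for task, cost in sorted(enumerate(task_costs), key=lambda tc: -tc[1]):
        load, p = heapq.heappop(loads)
        assignment[p].append(task)
        heapq.heappush(loads, (load + cost, p))
    makespan = max(load for load, _ in loads)
    return assignment, makespan

# Hypothetical per-module computation costs for a parallel program on 4 processors.
costs = [8.0, 7.5, 6.0, 5.5, 4.0, 3.5, 3.0, 2.5, 2.0, 1.0]
assignment, makespan = greedy_assign(costs, 4)
print("makespan:", makespan)
for p, tasks in assignment.items():
    print(f"processor {p}: tasks {tasks}")
```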

  9. Sub-Selective Quantization for Learning Binary Codes in Large-Scale Image Search.

    PubMed

    Li, Yeqing; Liu, Wei; Huang, Junzhou

    2018-06-01

    Recently with the explosive growth of visual content on the Internet, large-scale image search has attracted intensive attention. It has been shown that mapping high-dimensional image descriptors to compact binary codes can lead to considerable efficiency gains in both storage and performing similarity computation of images. However, most existing methods still suffer from expensive training devoted to large-scale binary code learning. To address this issue, we propose a sub-selection based matrix manipulation algorithm, which can significantly reduce the computational cost of code learning. As case studies, we apply the sub-selection algorithm to several popular quantization techniques including cases using linear and nonlinear mappings. Crucially, we can justify the resulting sub-selective quantization by proving its theoretic properties. Extensive experiments are carried out on three image benchmarks with up to one million samples, corroborating the efficacy of the sub-selective quantization method in terms of image retrieval.
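
    A rough sketch of the sub-selection idea under simplifying assumptions: the projection is learned from a random subset of rows (plain PCA stands in for the quantization model), then applied to the full data. This is not the authors' algorithm, only an illustration of why training on a sub-selected matrix is cheaper.

```python
import numpy as np

def subselective_binary_codes(X, n_bits=32, subsample=0.1, rng=None):
    """Learn a projection from a row subset, then binarize all points with a sign threshold."""
    rng = np.random.default_rng() if rng is None else rng
    n = X.shape[0]
    idx = rng.choice(n, size=max(n_bits, int(subsample * n)), replace=False)
    mean = X[idx].mean(axis=0)
    # PCA directions estimated from the subset only -- the expensive step shrinks.
    _, _, Vt = np.linalg.svd(X[idx] - mean, full_matrices=False)
    W = Vt[:n_bits].T
    return ((X - mean) @ W > 0).astype(np.uint8)
```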

  10. Metal nanoparticles (other than gold or silver) prepared using plant extracts for medical applications

    NASA Astrophysics Data System (ADS)

    Pasca, Roxana-Diana; Santa, Szabolcs; Racz, Levente Zsolt; Racz, Csaba Pal

    2016-12-01

    There are many ways to prepare metal nanoparticles, but reducing the metal ions with plant extracts is one of the most promising because it is considered less toxic to the environment, suitable for in vivo use of the resulting nanoparticles, and relatively inexpensive. Various metal ions have already been studied, such as cobalt, copper, iron, platinum, palladium, zinc, indium, manganese, and mercury, and the number of plant extracts used is continuously increasing. The prepared systems were afterwards characterized with a large number of investigation methods, both spectroscopic (especially UV-Vis spectroscopy) and microscopic (principally transmission electron microscopy, TEM). The applications of the metal nanoparticles obtained are diverse and not completely known, but medical applications occupy a central place, owing to their nontoxic components, although diverse industrial applications should not be overlooked.

  11. Optimal estimation of spatially variable recharge and transmissivity fields under steady-state groundwater flow. Part 1. Theory

    NASA Astrophysics Data System (ADS)

    Graham, Wendy D.; Tankersley, Claude D.

    1994-05-01

    Stochastic methods are used to analyze two-dimensional steady groundwater flow subject to spatially variable recharge and transmissivity. Approximate partial differential equations are developed for the covariances and cross-covariances between the random head, transmissivity and recharge fields. Closed-form solutions of these equations are obtained using Fourier transform techniques. The resulting covariances and cross-covariances can be incorporated into a Bayesian conditioning procedure which provides optimal estimates of the recharge, transmissivity and head fields given available measurements of any or all of these random fields. Results show that head measurements contain valuable information for estimating the random recharge field. However, when recharge is treated as a spatially variable random field, the value of head measurements for estimating the transmissivity field can be reduced considerably. In a companion paper, the method is applied to a case study of the Upper Floridan Aquifer in NE Florida.

  12. Illicit cigarette trade in Thailand.

    PubMed

    Pavananunt, Pirudee

    2011-11-01

    The sale and consumption of illicit tobacco increases consumption, impacts public health, reduces tax revenue and provides an argument against tax increases. Thailand has some of the best tobacco control policies in Southeast Asia with one of the highest tobacco tax rates, but illicit trade has the potential to undermine these policies and needs investigating. Two approaches were used to assess illicit trade between 1991 and 2006: method 1, comparison of tobacco used based on tobacco taxes paid and survey data, and method 2, discrepancies between export data from countries exporting tobacco to Thailand and Thai official data regarding imports. A three-year average was used to smooth differences due to lags between exports and imports. For 1991-2006, the estimated manufactured cigarette consumption from survey data was considerably lower than sales tax paid, so method 1 did not provide evidence of cigarette tax avoidance. Using method 2, the trade difference between reported imports and exports indicates that 10% of cigarettes consumed in Thailand (242 million packs per year) between 2004 and 2006 were illicit. The corresponding loss of revenue amounted to 4,508 million Baht (2002 prices), which was 14% of the total cigarette tax revenue. Cigarette excise tax rates had a negative relationship with consumption trends but no relation with the level of illicit trade. There is a need for improved policies against smuggling to combat the rise in illicit tobacco consumption. Regional coordination and implementation of protocols on illicit trade would help reduce incentives for illegal tax avoidance.
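
    The method 2 arithmetic can be sketched as follows (made-up figures, not the study's data): smooth partner-reported exports and officially recorded imports with a three-year average, then take the relative gap as the illicit share.

```python
def illicit_share(exports, imports, window=3):
    """Relative gap between smoothed exports to the country and its recorded imports."""
    def smooth(series):
        return [sum(series[i:i + window]) / window
                for i in range(len(series) - window + 1)]
    return [(e - m) / e for e, m in zip(smooth(exports), smooth(imports))]

# Illustrative pack counts (millions per year), not actual trade statistics.
print(illicit_share([2400, 2500, 2600, 2550], [2150, 2230, 2320, 2310]))
```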

  13. ILLICIT CIGARETTE TRADE IN THAILAND

    PubMed Central

    Pavananunt, Pirudee

    2012-01-01

    The sale and consumption of illicit tobacco increases consumption, impacts public health, reduces tax revenue and provides an argument against tax increases. Thailand has some of the best tobacco control policies in Southeast Asia with one of the highest tobacco tax rates, but illicit trade has the potential to undermine these policies and needs investigating. Two approaches were used to assess illicit trade between 1991 and 2006: method 1, comparison of tobacco used based on tobacco taxes paid and survey data, and method 2, discrepancies between export data from countries exporting tobacco to Thailand and Thai official data regarding imports. A three-year average was used to smooth differences due to lags between exports and imports. For 1991–2006, the estimated manufactured cigarette consumption from survey data was considerably lower than sales tax paid, so method 1 did not provide evidence of cigarette tax avoidance. Using method 2, the trade difference between reported imports and exports indicates that 10% of cigarettes consumed in Thailand (242 million packs per year) between 2004 and 2006 were illicit. The corresponding loss of revenue amounted to 4,508 million Baht (2002 prices), which was 14% of the total cigarette tax revenue. Cigarette excise tax rates had a negative relationship with consumption trends but no relation with the level of illicit trade. There is a need for improved policies against smuggling to combat the rise in illicit tobacco consumption. Regional coordination and implementation of protocols on illicit trade would help reduce incentives for illegal tax avoidance. PMID:22299425

  14. A novel post-processing scheme for two-dimensional electrical impedance tomography based on artificial neural networks

    PubMed Central

    2017-01-01

    Objective Electrical Impedance Tomography (EIT) is a powerful non-invasive technique for imaging applications. The goal is to estimate the electrical properties of living tissues by measuring the potential at the boundary of the domain. Being safe with respect to patient health, non-invasive, and having no known hazards, EIT is an attractive and promising technology. However, it suffers from a particular technical difficulty, which consists of solving a nonlinear inverse problem in real time. Several nonlinear approaches have been proposed as a replacement for the linear solver, but in practice very few are capable of stable, high-quality, and real-time EIT imaging because of their very low robustness to errors and inaccurate modeling, or because they require considerable computational effort. Methods In this paper, a post-processing technique based on an artificial neural network (ANN) is proposed to obtain a nonlinear solution to the inverse problem, starting from a linear solution. While common reconstruction methods based on ANNs estimate the solution directly from the measured data, the method proposed here enhances the solution obtained from a linear solver. Conclusion Applying a linear reconstruction algorithm before applying an ANN reduces the effects of noise and modeling errors. Hence, this approach significantly reduces the error associated with solving 2D inverse problems using machine-learning-based algorithms. Significance This work presents radical enhancements in the stability of nonlinear methods for biomedical EIT applications. PMID:29206856
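
    A toy sketch of the post-processing idea, assuming pairs of linear reconstructions and reference images are available for training; the synthetic data, layer size, and use of scikit-learn's MLPRegressor are illustrative stand-ins, not the network described in the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
truth = rng.random((500, 64))                              # stand-in "true" images
linear = truth + 0.3 * rng.standard_normal(truth.shape)    # stand-in linear EIT solutions

post = MLPRegressor(hidden_layer_sizes=(128,), max_iter=1000, random_state=0)
post.fit(linear, truth)              # learn the correction from linear to nonlinear solution
refined = post.predict(linear[:5])   # post-processed (enhanced) reconstructions
```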

  15. Fast learning method for convolutional neural networks using extreme learning machine and its application to lane detection.

    PubMed

    Kim, Jihun; Kim, Jonghong; Jang, Gil-Jin; Lee, Minho

    2017-03-01

    Deep learning has received significant attention recently as a promising solution to many problems in the area of artificial intelligence. Among several deep learning architectures, convolutional neural networks (CNNs) demonstrate superior performance when compared to other machine learning methods in the applications of object detection and recognition. We use a CNN for image enhancement and the detection of driving lanes on motorways. In general, the process of lane detection consists of edge extraction and line detection. A CNN can be used to enhance the input images before lane detection by excluding noise and obstacles that are irrelevant to the edge detection result. However, training conventional CNNs requires considerable computation and a big dataset. Therefore, we suggest a new learning algorithm for CNNs using an extreme learning machine (ELM). The ELM is a fast learning method used to calculate network weights between output and hidden layers in a single iteration and thus, can dramatically reduce learning time while producing accurate results with minimal training data. A conventional ELM can be applied to networks with a single hidden layer; as such, we propose a stacked ELM architecture in the CNN framework. Further, we modify the backpropagation algorithm to find the targets of hidden layers and effectively learn network weights while maintaining performance. Experimental results confirm that the proposed method is effective in reducing learning time and improving performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
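
    The single-solve character of the ELM can be sketched as below: hidden-layer weights are drawn at random and only the output weights are fitted, via one least-squares problem. This is the generic ELM step, not the stacked architecture or modified backpropagation proposed in the paper.

```python
import numpy as np

def elm_fit(X, T, n_hidden=200, rng=None):
    """Random hidden weights plus one least-squares solve for the output weights."""
    rng = np.random.default_rng() if rng is None else rng
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, T, rcond=None)  # output weights in a single step
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```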

  16. Method of multi-mode vibration control for the carbody of high-speed electric multiple unit trains

    NASA Astrophysics Data System (ADS)

    Gong, Dao; Zhou, Jinsong; Sun, Wenjing; Sun, Yu; Xia, Zhanghui

    2017-11-01

    A method of multi-mode vibration control for the carbody of high-speed electric multiple unit (EMU) trains by using the onboard and suspended equipment as dynamic vibration absorbers (DVAs) is proposed. The effect of the multi-mode vibration on the ride quality of a high-speed EMU train was studied, and the target modes of vibration control were determined. An equivalent mass identification method was used to determine the equivalent mass for the target modes at the device installation positions. To optimize the vibration acceleration response of the carbody, the natural frequencies and damping ratios of the lateral and vertical vibration were designed based on the theory of dynamic vibration absorption. In order to realize the optimized design values of the natural frequencies for the lateral and vertical vibrations simultaneously, a new type of vibration absorber was designed in which a Belleville spring and conventional rubber parts are connected in parallel. This design utilizes the negative stiffness of the Belleville spring. Results show that, as compared to rigid equipment connections, the proposed method effectively reduces the multi-mode vibration of a carbody in a high-speed EMU train, thereby achieving the control objectives. The ride quality in terms of the lateral and vertical vibration of the carbody is considerably improved. Moreover, the optimal value of the damping ratio is effective in dissipating the vibration energy, which reduces the vibration of both the carbody and the equipment.
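
    For context, the classical Den Hartog tuning rules for a dynamic vibration absorber attached to an undamped primary mode are reproduced below; designs of this kind typically start from these closed-form values, although the optimal values reported in the paper come from its own carbody model and may differ.

```latex
% Mass ratio \mu = m_a/m (absorber mass over the modal mass of the target carbody mode).
% Classical optimal absorber tuning (Den Hartog):
\[
  \frac{\omega_a}{\omega_n} = \frac{1}{1+\mu},
  \qquad
  \zeta_{a,\mathrm{opt}} = \sqrt{\frac{3\mu}{8\,(1+\mu)^{3}}}.
\]
```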

  17. Integrated NTP Vehicle Radiation Design

    NASA Technical Reports Server (NTRS)

    Caffrey, Jarvis A.; Rodriquez, Mitchell A.

    2018-01-01

    The development of a nuclear thermal propulsion stage requires consideration for radiation emitted from the nuclear reactor core. Applying shielding mass is an effective mitigating solution, but a better alternative is to incorporate some mitigation strategies into the propulsion stage and crew habitat. In this way, the required additional mass is minimized and the mass that must be applied may in some cases be able to serve multiple purposes. Strategies for crew compartment shielding are discussed that reduce dose from both engine and cosmic sources, and in some cases may also serve to reduce life support risks by permitting abundant water reserves. Early consideration for integrated mitigation solutions in a crewed nuclear thermal propulsion (NTP) vehicle will enable reduced radiation burden from both cosmic and nuclear sources, improved thrust-to-weight ratio or payload capacity by reducing 'dead mass' of shielding, and generally support a more robust risk posture for a NTP-powered Mars mission by permitting shorter trip times and increased water reserves.

  18. Integrated NTP Vehicle Radiation Design

    NASA Technical Reports Server (NTRS)

    Caffrey, Jarvis; Rodriquez, Mitchell

    2018-01-01

    The development of a nuclear thermal propulsion stage requires consideration for radiation emitted from the nuclear reactor core. Applying shielding mass is an effective mitigating solution, but a better alternative is to incorporate some mitigation strategies into the propulsion stage and crew habitat. In this way, the required additional mass is minimized and the mass that must be applied may in some cases be able to serve multiple purposes. Strategies for crew compartment shielding are discussed that reduce dose from both engine and cosmic sources, and in some cases may also serve to reduce life support risks by permitting abundant water reserves. Early consideration for integrated mitigation solutions in a crewed nuclear thermal propulsion (NTP) vehicle will enable reduced radiation burden from both cosmic and nuclear sources, improved thrust-to-weight ratio or payload capacity by reducing 'dead mass' of shielding, and generally support a more robust risk posture for a NTP-powered Mars mission by permitting shorter trip times and increased water reserves

  19. Using E-Cigarettes in the Home to Reduce Smoking and Secondhand Smoke: Disadvantaged Parents' Accounts

    ERIC Educational Resources Information Center

    Rowa-Dewar, Neneh; Rooke, Catriona; Amos, Amanda

    2017-01-01

    Electronic cigarettes (e-cigarettes) are subject to considerable public health debate. Most public health experts agree that for smokers who find it particularly challenging to quit, e-cigarettes may reduce harm. E-cigarette use in the home may also reduce children's secondhand smoke (SHS) exposure, although e-cigarette vapour may pose risks. This…

  20. Measurement of the total antioxidant response in preeclampsia with a novel automated method.

    PubMed

    Harma, Mehmet; Harma, Muge; Erel, Ozcan

    2005-01-10

    Preeclampsia is one of the most serious complications of pregnancy. Free radical damage has been implicated in the pathophysiology of this condition. In this study, we aimed to measure the antioxidant capacity in plasma samples from normotensive and preeclamptic pregnant women to evaluate their antioxidant status using a more recently developed automated measurement method. Our study group contained 42 women, 24 of whom had preeclampsia, while 18 had normotensive pregnancies. We measured the total plasma antioxidant capacity for all patients, as well as the levels of four major individual plasma antioxidant components: albumin, uric acid, ascorbic acid, and bilirubin, and, as a reciprocal measure, their total plasma peroxide levels. Statistically significant differences (determined using Student's t-test) were noted between the normotensive and the preeclamptic groups for their total antioxidant responses and their vitamin C levels (1.31 +/- 0.12 versus 1.06 +/- 0.41 mmol Trolox eq./L; 30.2 +/- 17.83 micromol/L versus 18.1 +/- 11.37 micromol/L, respectively), which were both considerably reduced in the preeclamptic patients. In contrast, the total plasma peroxide levels were significantly elevated in this group (49.8 +/- 14.3 micromol/L versus 38.8 +/- 9.6 micromol/L). We found a decreased total antioxidant response in preeclamptic patients using a simple, rapid and reliable automated colorimetric assay, which may be suitable for use in any routine clinical biochemistry laboratory, and considerably facilitates the assessment of this useful clinical parameter. We suggest that this novel method may be used as a routine test to evaluate and follow up the levels of oxidative stress in preeclampsia.

  1. Spatial filters and automated spike detection based on brain topographies improve sensitivity of EEG-fMRI studies in focal epilepsy.

    PubMed

    Siniatchkin, Michael; Moeller, Friederike; Jacobs, Julia; Stephani, Ulrich; Boor, Rainer; Wolff, Stephan; Jansen, Olav; Siebner, Hartwig; Scherg, Michael

    2007-09-01

    The ballistocardiogram (BCG) represents one of the most prominent sources of artifacts that contaminate the electroencephalogram (EEG) during functional MRI. The BCG artifacts may affect the detection of interictal epileptiform discharges (IED) in patients with epilepsy, reducing the sensitivity of the combined EEG-fMRI method. In this study we improved the BCG artifact correction using a multiple source correction (MSC) approach. On the one hand, a source analysis of the IEDs was applied to the EEG data obtained outside the MRI scanner to prevent the distortion of EEG signals of interest during the correction of BCG artifacts. On the other hand, the topographies of the BCG artifacts were defined based on the EEG recorded inside the scanner. The topographies of the BCG artifacts were then added to the surrogate model of IED sources and a combined source model was applied to the data obtained inside the scanner. The artifact signal was then subtracted without considerable distortion of the IED topography. The MSC approach was compared with the traditional averaged artifact subtraction (AAS) method. Both methods reduced the spectral power of BCG-related harmonics and enabled better detection of IEDs. Compared with the conventional AAS method, the MSC approach increased the sensitivity of IED detection because the IED signal was less attenuated when subtracting the BCG artifacts. The proposed MSC method is particularly useful in situations in which the BCG artifact is spatially correlated and time-locked with the EEG signal produced by the focal brain activity of interest.
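
    For reference, the baseline AAS correction that the MSC approach is compared against can be sketched as follows for a single EEG channel; the window length and R-peak detection are assumed to be supplied by the user.

```python
import numpy as np

def averaged_artifact_subtraction(eeg, r_peaks, half_window):
    """Average heartbeat-locked epochs and subtract the template around each R peak."""
    eeg = np.asarray(eeg, dtype=float)
    valid = [p for p in r_peaks
             if p - half_window >= 0 and p + half_window <= len(eeg)]
    template = np.mean([eeg[p - half_window:p + half_window] for p in valid], axis=0)
    cleaned = eeg.copy()
    for p in valid:
        cleaned[p - half_window:p + half_window] -= template
    return cleaned
```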

  2. Metallic artifact mitigation and organ-constrained tissue assignment for Monte Carlo calculations of permanent implant lung brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sutherland, J. G. H.; Miksys, N.; Thomson, R. M., E-mail: rthomson@physics.carleton.ca

    2014-01-15

    Purpose: To investigate methods of generating accurate patient-specific computational phantoms for the Monte Carlo calculation of lung brachytherapy patient dose distributions. Methods: Four metallic artifact mitigation methods are applied to six lung brachytherapy patient computed tomography (CT) images: simple threshold replacement (STR) identifies high CT values in the vicinity of the seeds and replaces them with estimated true values; fan beam virtual sinogram replaces artifact-affected values in a virtual sinogram and performs a filtered back-projection to generate a corrected image; 3D median filter replaces voxel values that differ from the median value in a region of interest surrounding the voxel and then applies a second filter to reduce noise; and a combination of fan beam virtual sinogram and STR. Computational phantoms are generated from artifact-corrected and uncorrected images using several tissue assignment schemes: both lung-contour constrained and unconstrained global schemes are considered. Voxel mass densities are assigned based on voxel CT number or using the nominal tissue mass densities. Dose distributions are calculated using the EGSnrc user-code BrachyDose for ¹²⁵I, ¹⁰³Pd, and ¹³¹Cs seeds and are compared directly as well as through dose volume histograms and dose metrics for target volumes surrounding surgical sutures. Results: Metallic artifact mitigation techniques vary in ability to reduce artifacts while preserving tissue detail. Notably, images corrected with the fan beam virtual sinogram have reduced artifacts, but residual artifacts near sources remain, requiring additional use of STR; the 3D median filter removes artifacts but simultaneously removes detail in lung and bone. Doses vary considerably between computational phantoms, with the largest differences arising from artifact-affected voxels assigned to bone in the vicinity of the seeds. Consequently, when metallic artifact reduction and constrained tissue assignment within lung contours are employed in generated phantoms, this erroneous assignment is reduced, generally resulting in higher doses. Lung-constrained tissue assignment also results in increased doses in regions of interest due to a reduction in the erroneous assignment of adipose to voxels within lung contours. Differences in dose metrics calculated for different computational phantoms are sensitive to radionuclide photon spectra, with the largest differences for ¹⁰³Pd seeds and smallest but still considerable differences for ¹³¹Cs seeds. Conclusions: Despite producing differences in CT images, the STR, fan beam + STR, and 3D median filter techniques produce similar dose metrics. Results suggest that the accuracy of dose distributions for permanent implant lung brachytherapy is improved by applying lung-constrained tissue assignment schemes to metallic artifact corrected images.
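
    Two of the simpler corrections named above can be sketched directly; the threshold and replacement values below are placeholders, since the actual values depend on the scanner and tissue calibration.

```python
import numpy as np
from scipy.ndimage import median_filter

def simple_threshold_replacement(ct, threshold=2000, fill_value=50):
    """STR-style sketch: clip implausibly high CT numbers near seeds to an
    estimated soft-tissue value (threshold/fill chosen for illustration only)."""
    out = ct.astype(float).copy()
    out[out > threshold] = fill_value
    return out

def median_artifact_filter(ct, size=3):
    """One pass of a 3-D median filter, as in the compared median-filter method."""
    return median_filter(ct, size=size)
```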

  3. Subsidence from underground mining; environmental analysis and planning considerations

    USGS Publications Warehouse

    Lee, Fitzhugh T.; Abel, John F.

    1983-01-01

    Subsidence, a universal process that occurs in response to the voids created by extracting solids or liquids from beneath the Earth's surface, is controlled by many factors including mining methods, depth of extraction, thickness of deposit, and topography, as well as the in situ properties of the rock mass above the deposit. The impacts of subsidence are potentially severe in terms of damage to surface utility lines and structures, changes in surface-water and ground-water conditions, and effects on vegetation and animals. Although subsidence cannot be eliminated, it can be reduced or controlled in areas where deformation of the ground surface would produce dangerous or costly effects. Subsidence prediction is highly developed in Europe where there are comparatively uniform mining conditions and a long history of field measurements. Much of this mining has been carried out beneath crowded urban and industrial areas where accurate predictions have facilitated use of the surface and reduced undesirable impacts. Concerted efforts to understand subsidence processes in the United States are recent. Empirical methods of subsidence analysis and prediction based on local conditions seem better suited to the current state of knowledge of the varied geologic and topographic conditions in domestic coal mining regions than do theoretical/mathematical approaches. In order to develop broadly applicable subsidence prediction methods and models for the United States, more information is needed on magnitude and timing of ground movements and geologic properties.

  4. Effectiveness of qPCR permutations, internal controls and dilution as means for minimizing the impact of inhibition while measuring Enterococcus in environmental waters.

    PubMed

    Cao, Y; Griffith, J F; Dorevitch, S; Weisberg, S B

    2012-07-01

    Draft criteria for the optional use of qPCR for recreational water quality monitoring have been published in the United States. One concern is that inhibition of the qPCR assay can lead to false-negative results and potentially inadequate public health protection. We evaluate the effectiveness of strategies for minimizing the impact of inhibition. Five qPCR method permutations for measuring Enterococcus were challenged with 133 potentially inhibitory fresh and marine water samples. Serial dilutions were conducted to assess Enterococcus target assay inhibition, to which inhibition identified using four internal controls (IC) was compared. The frequency and magnitude of inhibition varied considerably among qPCR methods, with the permutation using an environmental master mix performing substantially better. Fivefold dilution was also effective at reducing inhibition in most samples (>78%). ICs were variable and somewhat ineffective, with 54-85% agreement between ICs and serial dilution. The current IC methods appear to not accurately predict Enterococcus inhibition and should be used with caution; fivefold dilution and the use of reagents designed for environmental sample analysis (i.e. more robust qPCR chemistry) may be preferable. Suitable approaches for defining, detecting and reducing inhibition will improve implementation of qPCR for water monitoring. © 2012 The Authors. Journal of Applied Microbiology © 2012 The Society for Applied Microbiology.
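
    The dilution check reduces to simple arithmetic: under ideal amplification, a fivefold dilution should delay quantification by about log2(5) ≈ 2.3 cycles, so a much smaller shift suggests the undiluted reaction was inhibited. A sketch, with an illustrative tolerance:

```python
import math

def inhibition_suspected(cq_undiluted, cq_diluted, dilution_factor=5, tolerance=0.5):
    """Flag inhibition when the observed Cq shift is well below the ideal shift."""
    expected_shift = math.log2(dilution_factor)       # ~2.32 cycles for a 5-fold dilution
    observed_shift = cq_diluted - cq_undiluted
    return observed_shift < expected_shift - tolerance

print(inhibition_suspected(30.0, 30.6))   # True: shift far below the expected ~2.3 cycles
```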

  5. 77 FR 49451 - Agency Information Collection Activities: Consideration of Deferred Action for Childhood Arrivals...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-16

    ...-NEW] Agency Information Collection Activities: Consideration of Deferred Action for Childhood Arrivals... Deferred Action for Childhood Arrivals, 1615-NEW'' in the subject box. Regardless of the method used for... collection. (2) Title of the Form/Collection: Consideration of Deferred Action for Childhood Arrivals. (3...

  6. Afterword: Considerations for Future Practice of Assessment and Accountability

    ERIC Educational Resources Information Center

    Bresciani, Marilee J.

    2013-01-01

    This afterword offers challenges and considerations as the assessment movement continues to develop. The author offers some simple considerations for readers to ponder as they advance their evidence-based decision making processes, and encourages others to use these methods within the context of recent neuroscientific evidence that learning and…

  7. Detection of alpha-fetoprotein in magnetic immunoassay of thin channels using biofunctional nanoparticles

    NASA Astrophysics Data System (ADS)

    Tsai, H. Y.; Gao, B. Z.; Yang, S. F.; Li, C. S.; Fuh, C. Bor

    2014-01-01

    This paper presents the use of fluorescent biofunctional nanoparticles (10-30 nm) to detect alpha-fetoprotein (AFP) in a thin-channel magnetic immunoassay. We used an AFP model biomarker and s-shaped deposition zones to test the proposed detection method. The results show that detection using fluorescent biofunctional nanoparticles has a higher throughput than detection using the functional microparticles of previous experiments on affinity reactions. The proposed method takes about 3 min (versus 150 min for the previous method) to detect 100 samples. The proposed method is useful for screening biomarkers in clinical applications, and can reduce the run time for sandwich immunoassays to less than 20 min. The detection limits (0.06 pg/ml) and linear ranges (0.068 pg/ml-0.68 ng/ml) of AFP using fluorescent biofunctional nanoparticles are the same, within experimental error, as those obtained using functional microparticles. This detection limit is substantially lower and the linear range is considerably wider than those of enzyme-linked immunosorbent assay (ELISA) and other sandwich immunoassay methods. The differences between this method and an ELISA in AFP measurements of serum samples were less than 12%. The proposed method provides simple, fast, and sensitive detection with a high throughput for biomarkers.

  8. Flight-Test Evaluation of Flutter-Prediction Methods

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Marty

    2003-01-01

    The flight-test community routinely spends considerable time and money to determine a range of flight conditions, called a flight envelope, within which an aircraft is safe to fly. The cost of determining a flight envelope could be greatly reduced if there were a method of safely and accurately predicting the speed associated with the onset of an instability called flutter. Several methods have been developed with the goal of predicting flutter speeds to improve the efficiency of flight testing. These methods include (1) data-based methods, in which one relies entirely on information obtained from the flight tests and (2) model-based approaches, in which one relies on a combination of flight data and theoretical models. The data-driven methods include one based on extrapolation of damping trends, one that involves an envelope function, one that involves the Zimmerman-Weissenburger flutter margin, and one that involves a discrete-time auto-regressive model. An example of a model-based approach is that of the flutterometer. These methods have all been shown to be theoretically valid and have been demonstrated on simple test cases; however, until now, they have not been thoroughly evaluated in flight tests. An experimental apparatus called the Aerostructures Test Wing (ATW) was developed to test these prediction methods.
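
    The damping-trend idea, the simplest of the data-based methods listed above, can be sketched as a polynomial fit of modal damping versus airspeed extrapolated to its zero crossing; the data values below are illustrative only.

```python
import numpy as np

def predict_flutter_speed(speeds, damping_ratios, degree=2):
    """Fit damping vs. airspeed and return the first zero crossing above the test range."""
    coeffs = np.polyfit(speeds, damping_ratios, degree)
    roots = np.roots(coeffs)
    real = roots[np.isreal(roots)].real
    candidates = real[real > max(speeds)]
    return candidates.min() if candidates.size else None

v = np.array([100.0, 120.0, 140.0, 160.0])          # airspeeds (illustrative units)
zeta = np.array([0.06, 0.05, 0.035, 0.02])          # modal damping trending toward zero
print(predict_flutter_speed(v, zeta))                # predicted flutter onset speed
```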

  9. Improvement of Microtremor Data Filtering and Processing Methods Used in Determining the Fundamental Frequency of Urban Areas

    NASA Astrophysics Data System (ADS)

    Mousavi Anzehaee, Mohammad; Adib, Ahmad; Heydarzadeh, Kobra

    2015-10-01

    The way microtremor data are collected and filtered, as well as the processing method used, has a considerable effect on the accuracy of estimated dynamic soil parameters. In this paper, a running variance method was used to improve the automatic detection of data sections affected by local perturbations. In this method, the running variance of the microtremor data is computed using a sliding window, and the resulting signal is used to remove the perturbation-affected ranges from the original data. Additionally, to determine the fundamental frequency of a site, this study proposes a method based on statistical characteristics: the probability density graph and the mean and standard deviation of all frequencies corresponding to the maximum peaks in the H/V spectra of all data windows are used to distinguish real peaks from false peaks caused by perturbations. The methods have been applied to data recorded for the City of Meybod in central Iran. Experimental results show that the applied methods successfully reduce the effects of extensive local perturbations on microtremor data and ultimately estimate the fundamental frequency more accurately than other common methods.
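
    A minimal sketch of the running-variance screening step, assuming the record is a one-dimensional sample array; the window length and rejection threshold are illustrative.

```python
import numpy as np

def running_variance(signal, window=200):
    """Sliding-window variance of the record (edge values are approximate
    because of the zero padding in the convolution)."""
    s = np.asarray(signal, dtype=float)
    kernel = np.ones(window) / window
    mean = np.convolve(s, kernel, mode="same")
    mean_sq = np.convolve(s * s, kernel, mode="same")
    return mean_sq - mean**2

def remove_perturbed_sections(signal, window=200, k=3.0):
    """Keep only samples whose local variance stays below k times the median variance."""
    var = running_variance(signal, window)
    return np.asarray(signal)[var < k * np.median(var)]
```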

  10. Can treatment and disposal costs be reduced through metal recovery?

    USGS Publications Warehouse

    Smith, Kathleen S.; Figueroa, Linda; Plumlee, Geoffrey S.

    2015-01-01

    This paper describes a framework to conduct a “metal-recovery feasibility assessment” for mining influenced water (MIW) and associated treatment sludge. There are multiple considerations in such a determination, including the geologic/geochemical feasibility, market feasibility, technical feasibility, economic feasibility, and administrative feasibility. Each of these considerations needs to be evaluated to determine the practicality of metal recovery from a particular MIW.

  11. 77 FR 47677 - Duke Energy Carolinas, LLC, McGuire Nuclear Station, Units 1 and 2, Notice of Consideration of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-09

    ... Facility Operating License, Proposed No Significant Hazards Consideration Determination, and Opportunity... following methods: Federal Rulemaking Web site: Go to http://www.regulations.gov and search for Docket ID... related to this document by any of the following methods: Federal Rulemaking Web Site: Go to http://www...

  12. Nonlinear Reduced-Order Simulation Using An Experimentally Guided Modal Basis

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2012-01-01

    A procedure is developed for using nonlinear experimental response data to guide the modal basis selection in a nonlinear reduced-order simulation. The procedure entails using nonlinear acceleration response data to first identify proper orthogonal modes. Special consideration is given to cases in which some of the desired response data is unavailable. Bases consisting of linear normal modes are then selected to best represent the experimentally determined transverse proper orthogonal modes and either experimentally determined inplane proper orthogonal modes or the special case of numerically computed in-plane companions. The bases are subsequently used in nonlinear modal reduction and dynamic response simulations. The experimental data used in this work is simulated to allow some practical considerations, such as the availability of in-plane response data and non-idealized test conditions, to be explored. Comparisons of the nonlinear reduced-order simulations are made with the surrogate experimental data to demonstrate the effectiveness of the approach.
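
    The proper-orthogonal-mode identification itself reduces to a singular value decomposition of the response snapshot matrix; a minimal sketch is given below, with columns assumed to be time samples of the measured (or simulated) responses.

```python
import numpy as np

def proper_orthogonal_modes(snapshots, n_modes=5):
    """Leading left singular vectors of the mean-removed snapshot matrix are the POMs."""
    X = snapshots - snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    energy = s**2 / np.sum(s**2)          # fraction of response energy captured per mode
    return U[:, :n_modes], energy[:n_modes]
```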

  13. Economic and care considerations of Marfan syndrome.

    PubMed

    Blankart, Carl Rudolf; Milstein, Ricarda; Rybczynski, Meike; Schüler, Helke; von Kodolitsch, Yskert

    2016-10-01

    Marfan syndrome is a rare multisystem disease of the connective tissue, which affects multiple organ systems. Advances in healthcare have doubled the life expectancy of patients over the past three decades. To date, there is no comprehensive review that consolidates economic considerations and care for Marfan patients. Areas covered: Present research suggests that there may be a link between treatment pattern, disease progression, and the economic costs of Marfan syndrome. It indicates that early detection of the disease and preventive interventions achieve a dual aim. From a patient perspective, they may reduce the amount of emergency surgery or intervention and the number of inpatient stays. In addition, they slow disease progression, lower lifestyle restrictions, reduce psychological stress, and improve health-related quality of life. Expert commentary: Early detection and preventive measures are likely to achieve a dual aim by simultaneously containing costs and reducing the number and length of inpatient stays.

  14. Boiler burden reduced at Bedford site.

    PubMed

    Horsley, Chris

    2011-10-01

    With the NHS aiming to reduce its 2007 carbon footprint by 10% by 2015, Chris Horsley, managing director of Babcock Wanson UK, a provider of industrial boilers and burners, thermal oxidisers, air treatment, water treatment, and associated services, looks at how one NHS Trust has approached the challenge, and considerably reduced its carbon emissions, by refurbishing its boiler house and moving from oil to gas-fired steam generation.

  15. Combining Information on Multiple Detection Techniques to Estimate the Effect of Patent Foramen Ovale on Susceptibility to Decompression Illness

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.; Foster, Philip P.

    2001-01-01

    The assembly and the maintenance of the International Space Station are expected to require hundreds of extravehicular excursions (EVAs) in the next 10 years. During an EVA, in order to allow movement and bending of limbs, spacesuit pressures are reduced to about 4.3 psi, as compared with about 14.7 psi for normal atmospheric pressure at sea level. However, the exposure of astronauts to reduced pressures in spacesuits is conducive to formation and growth of gas bubbles within venous blood or tissues, which could cause decompression illness (DCI), a pathology best known to occur among deep-sea divers when they return to the surface. To reduce the risk of DCI, astronauts adjust to the reduced pressure in stages for a prolonged time known as a "pre-breathe" period prior to their extravehicular activity. Despite the use of pre-breathe protocols, an increased risk of DCI can arise for about 25% of humans who have a small hole, known as a patent foramen ovale (PFO), between two chambers of the heart. The atrial septum's fossa ovalis, an embryological remnant of a flap between the septum primum and septum secundum, allows fetal right atrial blood to pass into the left atrium, and usually closes after birth (Hagen et al., 1984). If fusion does not occur, a valve-like opening, the foramen ovale, persists between the two atria. It has been suggested that astronauts with PFOs might be at greater risk of stroke or other serious neurological DCI because bubbles from a venous site may traverse a PFO, travel to the aorta, and then enter the cerebral circulatory system, causing a stroke (Figure 1). Astronauts are not currently screened for PFOs; however, consideration is being given to doing so. Here, we study three main methods, abbreviated here as "TTE", "TCD" and "TEE", for detecting PFOs in living subjects. All involve the introduction of bubbles into a vein, immediately after which a sensory probe attempts to detect the bubbles in systemic circulation. Presence of the injected bubbles in the systemic circulation is indicative of a PFO. More detailed descriptions are given after the explanation of PFOs under Figure 1. Even if a true PFO affects the risk of DCI, there remains a question of how effective screening would be if the detection method has errors of omission and/or commission. Of the three methods studied here, TEE is the "gold standard", matching autopsy results with near-perfect sensitivity and specificity (Schneider et al., 1996). However, TEE is also the most difficult method to implement, requiring an internal esophageal probe, and is therefore not widely used. Currently, the easiest to use and most common PFO detection method is TTE, which uses an external chest probe. This method has a specificity of near 100%, but suffers from a low sensitivity rate (about 30%). More recently, TCD has been developed, which uses ultrasound probes to detect the presence of bubbles in cerebral arteries. Studies indicate that TCD is quite effective, having a sensitivity of about 91% and a specificity of about 93% (Droste et al., 1999) when applied correctly; however, implementation is difficult and requires considerable training.
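
    The screening question ultimately hinges on how test accuracy combines with prevalence; a simple Bayes calculation using the figures quoted above (about 25% PFO prevalence, TCD sensitivity of about 91% and specificity of about 93%) illustrates the point.

```python
def post_test_probability(prevalence, sensitivity, specificity):
    """Probability of a PFO given a positive screening result (Bayes' rule)."""
    p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    return sensitivity * prevalence / p_positive

print(post_test_probability(0.25, 0.91, 0.93))   # ~0.81 for TCD with the quoted figures
```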

  16. Development of insect life tables: comparison of two demographic methods of Delia antiqua (Diptera: Anthomyiidae) on different hosts.

    PubMed

    Ning, Shuoying; Zhang, Wenchao; Sun, Yan; Feng, Jinian

    2017-07-06

    In this study, we first construct an age-stage, two-sex life table for the onion maggot, Delia antiqua, grown on three host plants: onion, scallion, and garlic. We found that onion is the optimal host for this species, and populations grown on onion have the maximum fecundity, the longest adult longevity and reproduction period, and the shortest immature developmental time. In contrast, the fecundity on other hosts was lower, particularly on garlic, but these crops can also serve as important secondary hosts for this pest. These data will be useful to growers in developing specific integrated management programs for each host. We also compared demographic analyses using individually reared and group-reared methods. These two methods provided similarly accurate outcomes for estimating insect population dynamics for this species. However, for gregarious species, using the individually reared method to construct insect life tables produces inaccurate results, and researchers must use the group-reared method for life table calculations. When studying large groups of insects, group-reared demographic analysis for the age-stage, two-sex life table can also simplify statistical analysis, save considerable labor, and reduce experimental errors.

  17. A parallelization method for time periodic steady state in simulation of radio frequency sheath dynamics

    NASA Astrophysics Data System (ADS)

    Kwon, Deuk-Chul; Shin, Sung-Sik; Yu, Dong-Hun

    2017-10-01

    In order to reduce the computing time in simulation of radio frequency (rf) plasma sources, various numerical schemes have been developed. It is well known that the upwind, exponential, and power-law schemes can efficiently overcome the limitation on the grid size for fluid transport simulations of high density plasma discharges. Also, the semi-implicit method is a well-known numerical scheme for overcoming the limitation on the simulation time step. However, despite remarkable advances in numerical techniques and computing power over the last few decades, efficient multi-dimensional modeling of low temperature plasma discharges has remained a considerable challenge. In particular, parallelization in time has been difficult for time-periodic steady-state problems such as capacitively coupled plasma discharges and rf sheath dynamics, because plasma parameter values from the previous time step are used to calculate new values at each time step. Therefore, we present a parallelization method for time-periodic steady-state problems using period-slices. In order to evaluate the efficiency of the developed method, one-dimensional fluid simulations are conducted to describe rf sheath dynamics. The results show that speedup can be achieved using a multithreading method.

  18. Assimilation of river altimetry data for effective bed elevation and roughness coefficient

    NASA Astrophysics Data System (ADS)

    Brêda, João Paulo L. F.; Paiva, Rodrigo C. D.; Bravo, Juan Martin; Passaia, Otávio

    2017-04-01

    Hydrodynamic models of large rivers are important tools for predicting river discharge, water level, and floods. However, these techniques still carry considerable errors, part of them related to parameter uncertainties in river bathymetry and the roughness coefficient. Data from recent spatial altimetry missions offer an opportunity to reduce parameter uncertainty through inverse methods. This study aims to develop and assess different methods of altimetry data assimilation to improve estimates of river bottom levels and Manning roughness in a 1-D hydrodynamic model. The case study was a 1,100 km reach of the Madeira River, a tributary of the Amazon. The tested assimilation methods are direct insertion, linear interpolation, the SCE-UA global optimization algorithm, and a Kalman filter adaptation. The Kalman filter method uses new physically based covariance functions developed from steady-flow and backwater equations. The benefits of altimetry missions with different spatio-temporal resolutions, such as ICESat-1, Envisat, and Jason-2, are assessed. Level time series from 5 gauging stations and 5 GPS river height profiles are used to assess and validate the assimilation methods. Finally, the potential of future missions such as the ICESat-2 and SWOT satellites is discussed.
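
    Of the assimilation options listed, the Kalman-type update has the most compact generic form; a sketch is given below for a linear(ized) observation operator, with the physically based covariances of the study replaced by whatever prior covariance the user supplies.

```python
import numpy as np

def kalman_update(x_prior, P_prior, H, R, y):
    """One update of parameters x (e.g. bed elevation, roughness) from altimetric levels y."""
    S = H @ P_prior @ H.T + R                    # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_post = x_prior + K @ (y - H @ x_prior)
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
    return x_post, P_post
```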

  19. Low-dose X-ray computed tomography image reconstruction with a combined low-mAs and sparse-view protocol.

    PubMed

    Gao, Yang; Bian, Zhaoying; Huang, Jing; Zhang, Yunwan; Niu, Shanzhou; Feng, Qianjin; Chen, Wufan; Liang, Zhengrong; Ma, Jianhua

    2014-06-16

    To realize low-dose imaging in X-ray computed tomography (CT) examination, lowering milliampere-seconds (low-mAs) or reducing the required number of projection views (sparse-view) per rotation around the body has been widely studied as an easy and effective approach. In this study, we are focusing on low-dose CT image reconstruction from the sinograms acquired with a combined low-mAs and sparse-view protocol and propose a two-step image reconstruction strategy. Specifically, to suppress significant statistical noise in the noisy and insufficient sinograms, an adaptive sinogram restoration (ASR) method is first proposed with consideration of the statistical property of sinogram data, and then to further acquire a high-quality image, a total variation based projection onto convex sets (TV-POCS) method is adopted with a slight modification. For simplicity, the present reconstruction strategy was termed as "ASR-TV-POCS." To evaluate the present ASR-TV-POCS method, both qualitative and quantitative studies were performed on a physical phantom. Experimental results have demonstrated that the present ASR-TV-POCS method can achieve promising gains over other existing methods in terms of the noise reduction, contrast-to-noise ratio, and edge detail preservation.
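
    The two-step flavor of the strategy can be sketched with off-the-shelf stand-ins: a Gaussian smoothing of the sinogram in place of the adaptive restoration, and Chambolle TV denoising in place of TV-POCS; `reconstruct` is assumed to be any analytic reconstruction (e.g. filtered back-projection) provided by the user.

```python
from scipy.ndimage import gaussian_filter
from skimage.restoration import denoise_tv_chambolle

def two_step_lowdose_sketch(sinogram, reconstruct, sino_sigma=1.0, tv_weight=0.1):
    """Restore the noisy, sparse sinogram, reconstruct, then regularize in image space."""
    restored = gaussian_filter(sinogram, sigma=sino_sigma)   # stand-in for ASR
    image = reconstruct(restored)                            # e.g. filtered back-projection
    return denoise_tv_chambolle(image, weight=tv_weight)     # stand-in for TV-POCS
```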

  20. Investigation on the reproduction performance versus acoustic contrast control in sound field synthesis.

    PubMed

    Bai, Mingsian R; Wen, Jheng-Ciang; Hsu, Hoshen; Hua, Yi-Hsin; Hsieh, Yu-Hao

    2014-10-01

    A sound reconstruction system is proposed for audio reproduction with extended sweet spot and reduced reflections. An equivalent source method (ESM)-based sound field synthesis (SFS) approach, with the aid of dark zone minimization is adopted in the study. Conventional SFS that is based on the free-field assumption suffers from synthesis error due to boundary reflections. To tackle the problem, the proposed system utilizes convex optimization in designing array filters with both reproduction performance and acoustic contrast taken into consideration. Control points are deployed in the dark zone to minimize the reflections from the walls. Two approaches are employed to constrain the pressure and velocity in the dark zone. Pressure matching error (PME) and acoustic contrast (AC) are used as performance measures in simulations and experiments for a rectangular loudspeaker array. Perceptual Evaluation of Audio Quality (PEAQ) is also used to assess the audio reproduction quality. The results show that the pressure-constrained (PC) method yields better acoustic contrast, but poorer reproduction performance than the pressure-velocity constrained (PVC) method. A subjective listening test also indicates that the PVC method is the preferred method in a live room.
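
    A common way to cast this trade-off, given transfer matrices from the loudspeakers to control points in the bright and dark zones, is a regularized least-squares problem; the sketch below uses a fixed dark-zone weight rather than the constrained convex program of the paper.

```python
import numpy as np

def pressure_matching_filters(G_bright, p_target, G_dark, dark_weight=10.0, reg=1e-3):
    """Match the target pressure in the bright zone while penalizing dark-zone energy."""
    n = G_bright.shape[1]
    A = (G_bright.conj().T @ G_bright
         + dark_weight * G_dark.conj().T @ G_dark
         + reg * np.eye(n))
    b = G_bright.conj().T @ p_target
    return np.linalg.solve(A, b)          # complex driving weights for the array
```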

  1. Simplifier: a web tool to eliminate redundant NGS contigs.

    PubMed

    Ramos, Rommel Thiago Jucá; Carneiro, Adriana Ribeiro; Azevedo, Vasco; Schneider, Maria Paula; Barh, Debmalya; Silva, Artur

    2012-01-01

    Modern genomic sequencing technologies produce a large amount of data with reduced cost per base; however, this data consists of short reads. This reduction in the size of the reads, compared to those obtained with previous methodologies, presents new challenges, including a need for efficient algorithms for the assembly of genomes from short reads and for resolving repetitions. Additionally, after ab initio assembly, curation of the hundreds or thousands of contigs generated by assemblers demands considerable time and computational resources. We developed Simplifier, a stand-alone software that selectively eliminates redundant sequences from the collection of contigs generated by ab initio assembly of genomes. Application of Simplifier to data generated by assembly of the genome of Corynebacterium pseudotuberculosis strain 258 reduced the number of contigs generated by ab initio methods from 8,004 to 5,272, a reduction of 34.14%; in addition, N50 increased from 1 kb to 1.5 kb. Processing the contigs of Escherichia coli DH10B with Simplifier reduced the mate-paired library by 17.47% and the fragment library by 23.91%. Simplifier removed redundant sequences from datasets produced by assemblers, thereby reducing the effort required for finalization of genome assembly in tests with data from prokaryotic organisms. Simplifier is available at http://www.genoma.ufpa.br/rramos/softwares/simplifier.xhtml. It requires Sun JDK 6 or higher.
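
    The core redundancy-removal idea can be sketched in a few lines (exact substring containment only; a real tool must also handle reverse complements, mismatches, and large inputs).

```python
def remove_redundant_contigs(contigs):
    """Drop any contig that is fully contained in a longer retained contig."""
    kept = []
    for contig in sorted(contigs, key=len, reverse=True):
        if not any(contig in longer for longer in kept):
            kept.append(contig)
    return kept

print(remove_redundant_contigs(["ATCGGATC", "GGAT", "TTTT", "ATCGG"]))
# ['ATCGGATC', 'TTTT'] -- the two contained contigs are removed
```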

  2. America's Opioid Epidemic: Supply and Demand Considerations.

    PubMed

    Clark, David J; Schumacher, Mark A

    2017-11-01

    America is in the midst of an opioid epidemic characterized by aggressive prescribing practices, highly prevalent opioid misuse, and rising rates of prescription and illicit opioid overdose-related deaths. Medical and lay public sentiment have become more cautious with respect to prescription opioid use in the past few years, but a comprehensive strategy to reduce our reliance on prescription opioids is lacking. Addressing this epidemic through reductions in unnecessary access to these drugs while implementing measures to reduce demand will be important components of any comprehensive solution. Key supply-side measures include avoiding overprescribing, reducing diversion, and discouraging misuse through changes in drug formulations. Important demand-side measures center around educating patients and clinicians regarding the pitfalls of opioid overuse and methods to avoid unnecessary exposure to these drugs. Anesthesiologists, by virtue of their expertise in the use of these drugs and their position in guiding opioid use around the time of surgery, have important roles to play in reducing patient exposure to opioids and providing education about appropriate use. Aside from the many immediate steps that can be taken, clinical and basic research directed at understanding the interaction between pain and opioid misuse is critical to identifying the optimal use of these powerful pain relievers in clinical practice.

  3. Development and assessment of plant-based synthetic odor baits for surveillance and control of Malaria vectors

    USDA-ARS?s Scientific Manuscript database

    Recent malaria vector control measures have considerably reduced indoor biting mosquito populations. However, reducing the outdoor biting populations remains a challenge because of the unavailability of appropriate lures to achieve this. This study sought to test the efficacy of plant-based syntheti...

  4. Surface Heave Behaviour of Coir Geotextile Reinforced Sand Beds

    NASA Astrophysics Data System (ADS)

    Lal, Dharmesh; Sankar, N.; Chandrakaran, S.

    2017-06-01

    Soil reinforcement by natural fibers is one of the cheapest and most attractive ground improvement techniques. Coir is the most abundant natural fiber available in India and, due to its high lignin content, has a longer life span than other natural fibers. It is widely used in India for erosion control purposes, but its use as a reinforcement material is rather limited. This study focuses on the use of coir geotextile as a reinforcement material to reduce surface heave phenomena occurring in shallow foundations. This paper presents the results of laboratory model tests carried out on square footings supported on coir geotextile reinforced sand beds. The influence of various parameters such as depth of reinforcement, length, and number of layers of reinforcement was studied. It was observed that surface heave is considerably reduced with the provision of geotextile. Heave reduction up to 98.7% can be obtained by the proposed method. Heave reduction is quantified by a non-dimensional parameter called the heave reduction factor.

  5. Lightning protection guidelines and test data for adhesively bonded aircraft structures

    NASA Technical Reports Server (NTRS)

    Pryzby, J. E.; Plumer, J. A.

    1984-01-01

    The highly competitive marketplace and increasing cost of energy has motivated manufacturers of general aviation aircraft to utilize composite materials and metal-to-metal bonding in place of conventional fasteners and rivets to reduce weight, obtain smoother outside surfaces and reduce drag. The purpose of this program is the protection of these new structures from hazardous lightning effects. The program began with a survey of advance-technology materials and fabrication methods under consideration for future designs. Sub-element specimens were subjected to simulated lightning voltages and currents. Measurements of bond line voltages, electrical sparking, and mechanical strength degradation were made to comprise a data base of electrical properties for new technology materials and basic structural configurations. The second phase of the program involved tests on full scale wing structures which contained integral fuel tanks and which were representative of examples of new technology structures and fuel systems. The purpose of these tests was to provide a comparison between full scale structural measurements and those obtained from the sub-element specimens.

  6. Ethanolic extract of Piper betle Linn. leaves reduces nociception via modulation of arachidonic acid pathway

    PubMed Central

    De, Soumita; Maroo, Niteeka; Saha, Piu; Hazra, Samik; Chatterjee, Mitali

    2013-01-01

    Objectives: The objective of this study was to evaluate the peripheral analgesic effect of Piper betle leaf extract (PBE) along with establishing its putative mechanism of action. Materials and Methods: Male Swiss albino mice, after pre-treatment (1 h) with different doses of PBE, were injected with 0.8% (v/v) acetic acid i.p.; the onset and number of writhes were noted for up to 15 min. To evaluate the mechanism of action, the murine peritoneal exudate was incubated with PBE for 1 h, followed by exposure to arachidonic acid (AA), and generation of reactive oxygen species (ROS) was measured by flow cytometry using 2’,7’-dichlorodihydrofluorescein diacetate. Results: PBE significantly reduced the acetic acid-induced writhing response in mice in a dose-dependent manner (P < 0.001). In peritoneal exudates, PBE significantly inhibited AA-induced generation of ROS (P < 0.01). Conclusions: The present study indicates that PBE has promising analgesic activity, worthy of future pharmacological consideration. PMID:24130383

  7. Revealing Optical Components

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The Optical Vector Analyzer (OVA) 1550 significantly reduces the time and cost of testing sophisticated optical components. The technology grew from the research Luna Technologies' Dr. Mark Froggatt conducted on optical fiber strain measurement while working at Langley Research Center. Dr. Froggatt originally developed the technology for non- destructive evaluation testing at Langley. The new technique can provide 10,000 independent strain measurements while adding less than 10 grams to the weight of the vehicle. The OVA is capable of complete linear characterization of single-mode optical components used in high- bit-rate applications. The device can test most components over their full range in less than 30 seconds, compared to the more than 20 minutes required by other testing methods. The dramatically shortened measurement time results in increased efficiency in final acceptance tests of optical devices, and the comprehensive data produced by the instrument adds considerable value for component consumers. The device eliminates manufacturing bottlenecks, while reducing labor costs and wasted materials during production.

  8. Solar cell design for avoiding LILT degradation. [low intensity, low temperature

    NASA Technical Reports Server (NTRS)

    Stella, P. M.; Ctorry, G. T.

    1987-01-01

    Growing concerns about radioisotope thermoelectric generator (RTG) performance potential, cost, safety, and availability have renewed interest in utilizing photovoltaic energy conversion for future JPL interplanetary missions such as the Mariner Mark II set. Although lightweight solar array technology has advanced to the point where it would appear to provide an alternative power source, anomalous silicon cell curve shape degradation at conditions of low intensity and low temperature (LILT) severely restricts photovoltaic applications for missions beyond 3 AU solar distance. In order to extend photovoltaic applications to distances of 5 AU, ways to minimize the deleterious impact of LILT cell degradation were investigated. These investigations have ranged from consideration of individual cell selection for LILT behavior to the examination of methods for reducing or eliminating cell LILT degradation by modifying the cell processing. Use of a partial oxide barrier between the cell n+ contacts and the silicon has been shown to reduce significantly both the occurrence and magnitude of the LILT degradation.

  9. Metabolic Flux Analysis in Isotope Labeling Experiments Using the Adjoint Approach.

    PubMed

    Mottelet, Stephane; Gaullier, Gil; Sadaka, Georges

    2017-01-01

    Comprehension of metabolic pathways is considerably enhanced by metabolic flux analysis in isotope labeling experiments (MFA-ILE). The balance equations are given by hundreds of algebraic (stationary MFA) or ordinary differential equations (nonstationary MFA), so reducing the number of operations is a crucial part of reducing the computation cost. The main bottleneck for deterministic algorithms is the computation of derivatives, particularly for nonstationary MFA. In this article, we explain how the overall identification process may be sped up by using the adjoint approach to compute the gradient of the residual sum of squares. The proposed approach shows significant improvements in complexity and computation time compared with the usual (direct) approach. Numerical results are obtained for the central metabolic pathways of Escherichia coli and are validated against reference software in the stationary case. The methods and algorithms described in this paper are included in the sysmetab software package, distributed under an open-source license at http://forge.scilab.org/index.php/p/sysmetab/.
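
    The computational advantage of the adjoint approach can be made concrete with a small worked sketch. The notation below (S, x, p, M, y, λ) is generic and not taken from the paper: for stationary MFA, suppose the labeling balance is a linear system in the labeling state x, parameterized by the free fluxes p, with measurements entering through a residual sum of squares.

      % Hedged sketch of the adjoint gradient for a stationary labeling balance;
      % all symbols are illustrative placeholders, not the authors' notation.
      \begin{align*}
        \text{model:}\quad   & S(p)\,x(p) = b, \qquad f(p) = \tfrac{1}{2}\lVert M x(p) - y \rVert_2^2 \\
        \text{direct:}\quad  & \frac{\partial x}{\partial p_k} = -S^{-1}\frac{\partial S}{\partial p_k}\,x
                               \quad\text{(one linear solve per free flux } p_k\text{)} \\
        \text{adjoint:}\quad & S(p)^{\mathsf T}\lambda = M^{\mathsf T}(M x - y)
                               \quad\text{(a single extra linear solve)} \\
                             & \frac{\partial f}{\partial p_k} = -\lambda^{\mathsf T}\frac{\partial S}{\partial p_k}\,x .
      \end{align*}

    Under this reading, the gradient costs one forward and one adjoint solve regardless of the number of free fluxes, which is the kind of complexity saving the abstract reports; the nonstationary case replaces the linear solves with forward and backward (adjoint) ODE integrations.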

  10. Damage Resistant Optical Glasses for High Power Lasers: A Continuing Glass Science and Technology Challenge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campbell, J H

    2002-08-28

    A major challenge in the development of optical glasses for high-power lasers is reducing or eliminating laser-induced damage to the interior (bulk) and the polished surface of the glass. Bulk laser damage in glass generally originates from inclusions. With the development of novel glass melting and forming processes it is now possible to make both fused silica and a suite of metaphosphate laser glasses in large sizes (≳0.5 m in diameter), free of inclusions and with high optical homogeneity (≈10⁻⁶). Considerable attention also has been focused on improving the laser damage resistance of polished optical glass surfaces. Studies have shown that laser-induced damage to surfaces grows exponentially with the number of shots when illuminated with nanosecond pulses at 351 nm above a given fluence threshold. A new approach for reducing and eliminating laser-induced surface damage relies on a series of post-polishing treatment steps. This damage improvement method is briefly reviewed.

  11. Plasma-treated Langmuir-Blodgett reduced graphene oxide thin film for applications in biophotovoltaics

    NASA Astrophysics Data System (ADS)

    Ibrahim, Siti Aisyah; Jaafar, Muhammad Musoddiq; Ng, Fong-Lee; Phang, Siew-Moi; Kumar, G. Ghana; Majid, Wan Haliza Abd; Periasamy, Vengadesh

    2018-01-01

    The surface optimization and structural characteristics of Langmuir-Blodgett (LB) reduced graphene oxide (rGO) thin films treated by argon plasma were studied. In this work, six depositions of rGO were made on a clean glass substrate using the LB method. The LB-rGO thin films were then exposed to argon plasma at powers of 20, 60, 100 and 140 W. The plasma treatment generally improves the wettability, or hydrophilicity, of the film surface compared with the untreated film. Maximum wettability was observed at a plasma power of 20 W, which also increased the adhesion of the rGO film to the glass substrate. The fabricated multilayer films were characterized by spectroscopic, structural and electrical studies. Treatment of rGO with argon plasma was found to improve its biocompatibility, and its performance as an electrode for biophotovoltaic devices was thus considerably enhanced.

  12. Printed interconnects for photovoltaic modules

    DOE PAGES

    Fields, J. D.; Pach, G.; Horowitz, K. A. W.; ...

    2016-10-21

    Film-based photovoltaic modules employ monolithic interconnects to minimize resistance loss and enhance module voltage via series connection. Conventional interconnect construction occurs sequentially, with a scribing step following deposition of the bottom electrode, a second scribe after deposition of absorber and intermediate layers, and a third following deposition of the top electrode. This method produces interconnect widths of about 300 µm, and the area occupied by interconnects within a module (generally about 3%) does not contribute to power generation. The present work reports on an increasingly popular strategy capable of reducing the interconnect width to less than 100 µm: printing interconnects. Cost modeling projects a savings of about $0.02/watt for CdTe module production through the use of printed interconnects, with savings coming from both reduced capital expense and increased module power output. Printed interconnect demonstrations with copper-indium-gallium-diselenide and cadmium-telluride solar cells show successful voltage addition and miniaturization down to 250 µm. Finally, material selection guidelines and considerations for commercialization are discussed.

  13. Purchase decision-making is modulated by vestibular stimulation.

    PubMed

    Preuss, Nora; Mast, Fred W; Hasler, Gregor

    2014-01-01

    Purchases are driven by consumers' product preferences and price considerations. Using caloric vestibular stimulation (CVS), we investigated the role of vestibular-affective circuits in purchase decision-making. CVS is an effective noninvasive brain stimulation method, which activates vestibular and overlapping emotional circuits (e.g., the insular cortex and the anterior cingulate cortex (ACC)). Subjects were exposed to CVS and sham stimulation while they performed two purchase decision-making tasks. In Experiment 1 subjects had to decide whether to purchase or not. CVS significantly reduced probability of buying a product. In Experiment 2 subjects had to rate desirability of the products and willingness to pay (WTP) while they were exposed to CVS and sham stimulation. CVS modulated desirability of the products but not WTP. The results suggest that CVS interfered with emotional circuits and thus attenuated the pleasant and rewarding effect of acquisition, which in turn reduced purchase probability. The present findings contribute to the rapidly growing literature on the neural basis of purchase decision-making.

  14. Purchase decision-making is modulated by vestibular stimulation

    PubMed Central

    Preuss, Nora; Mast, Fred W.; Hasler, Gregor

    2014-01-01

    Purchases are driven by consumers’ product preferences and price considerations. Using caloric vestibular stimulation (CVS), we investigated the role of vestibular-affective circuits in purchase decision-making. CVS is an effective noninvasive brain stimulation method, which activates vestibular and overlapping emotional circuits (e.g., the insular cortex and the anterior cingulate cortex (ACC)). Subjects were exposed to CVS and sham stimulation while they performed two purchase decision-making tasks. In Experiment 1 subjects had to decide whether to purchase or not. CVS significantly reduced probability of buying a product. In Experiment 2 subjects had to rate desirability of the products and willingness to pay (WTP) while they were exposed to CVS and sham stimulation. CVS modulated desirability of the products but not WTP. The results suggest that CVS interfered with emotional circuits and thus attenuated the pleasant and rewarding effect of acquisition, which in turn reduced purchase probability. The present findings contribute to the rapidly growing literature on the neural basis of purchase decision-making. PMID:24600365

  15. An efficient management system for wireless sensor networks.

    PubMed

    Ma, Yi-Wei; Chen, Jiann-Liang; Huang, Yueh-Min; Lee, Mei-Yu

    2010-01-01

    Wireless sensor networks have garnered considerable attention recently. Networks typically have many sensor nodes, and are used in commercial, medical, scientific, and military applications for sensing and monitoring the physical world. Many researchers have attempted to improve wireless sensor network management efficiency. A Simple Network Management Protocol (SNMP)-based sensor network management system was developed that is a convenient and effective way for managers to monitor and control sensor network operations. This paper proposes a novel WSNManagement system that can show the connection states and relationships among sensor nodes and can be used for monitoring, collecting, and analyzing information obtained by wireless sensor networks. The proposed network management system uses the collected information for system configuration, and its performance-analysis function facilitates convenient management of the sensors. Experimental results show that the proposed method enhances the alive rate of the overall sensor node system, reduces the packet loss rate by roughly 5%, and reduces delay time by roughly 0.2 seconds. Performance analysis demonstrates that the proposed system is effective for wireless sensor network management.

  16. Simulation and Analysis of Three-Phase Rectifiers for Aerospace Power Applications

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Birchenough, Arthur G.

    2004-01-01

    Due to the nature of planned planetary missions, fairly large advanced power systems are required for the spacecraft. These future high-power spacecraft are expected to use dynamic power conversion systems incorporating high-speed alternators as the three-phase AC electrical power source. One of the early design considerations in such systems is the type of rectification to be used with the AC source for DC user loads. This paper addresses the issues involved with two different rectification methods, namely conventional six-pulse and twelve-pulse rectification. Two circuit configurations involving parallel combinations of the six- and twelve-pulse rectifiers were selected for the simulation. The rectifiers' input and output power waveforms are thoroughly examined through simulations. The effects of the parasitic load for power balancing and of filter components for reducing the ripple voltage at the DC loads are also included in the analysis. Details of the simulation circuits, simulation results, and design examples for reducing the risk of damage to spacecraft engines are presented and discussed.

  17. Fast prediction and evaluation of eccentric inspirals using reduced-order models

    NASA Astrophysics Data System (ADS)

    Barta, Dániel; Vasúth, Mátyás

    2018-06-01

    A large number of theoretically predicted waveforms are required by matched-filtering searches for the gravitational-wave signals produced by compact binary coalescence. In order to substantially alleviate the computational burden in gravitational-wave searches and parameter estimation without degrading the signal detectability, we propose a novel reduced-order-model (ROM) approach with applications to adiabatic 3PN-accurate inspiral waveforms of nonspinning sources that evolve on either highly or slightly eccentric orbits. We provide a singular-value decomposition-based reduced-basis method in the frequency domain to generate reduced-order approximations of any gravitational waves with acceptable accuracy and precision within the parameter range of the model. We construct efficient reduced bases comprising a relatively small number of the most relevant waveforms over the three-dimensional parameter space covered by the template bank (total mass 2.15 M⊙ ≤ M ≤ 215 M⊙, mass ratio 0.01 ≤ q ≤ 1, and initial orbital eccentricity 0 ≤ e0 ≤ 0.95). The ROM is designed to predict signals in the frequency band from 10 Hz to 2 kHz at aLIGO and aVirgo design sensitivity. Besides moderating the data reduction, finer sampling of fiducial templates improves the accuracy of the surrogates. A considerable increase in the speedup, from several hundreds to thousands, can be achieved by evaluating surrogates for low-mass systems, especially when combined with high eccentricity.
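
    The reduced-basis construction described above can be illustrated with a generic SVD sketch in Python. This is not the authors' pipeline; the array shapes, the tolerance, and the row-wise template layout are assumptions made for the example.

      import numpy as np

      def build_reduced_basis(training_waveforms, tol=1e-6):
          """Orthonormal reduced basis from frequency-domain training waveforms.

          training_waveforms: complex array, shape (n_templates, n_frequencies).
          tol: relative singular-value cutoff controlling the representation error.
          """
          _, s, vh = np.linalg.svd(training_waveforms, full_matrices=False)
          rank = int(np.sum(s / s[0] >= tol))   # keep modes with s_i / s_0 >= tol
          return vh[:rank]                      # rows are orthonormal basis vectors

      def project(waveform, basis):
          """Reduced-order representation: coefficients of the waveform in the basis."""
          return basis.conj() @ waveform

      def reconstruct(coeffs, basis):
          """Surrogate evaluation: rebuild the waveform from its reduced coefficients."""
          return basis.T @ coeffs

    Matched-filter overlaps can then be evaluated in the low-dimensional coefficient space, which is where the reported speedup for long, low-mass, eccentric templates would come from.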

  18. 31 CFR Appendix to Part 351 - Tax Considerations

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... 351, App. A Appendix to Part 351—Tax Considerations 1. What are some general tax considerations... any other obligations purchased on a discount basis. (b) Changing methods. If you use the cash basis... primary owner. (d) The purchase of a Series EE savings bond as a gift may have gift tax consequences. ...

  19. 31 CFR Appendix to Part 351 - Tax Considerations

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    .... 351, App. A Appendix to Part 351—Tax Considerations 1. What are some general tax considerations... any other obligations purchased on a discount basis. (b) Changing methods. If you use the cash basis... primary owner. (d) The purchase of a Series EE savings bond as a gift may have gift tax consequences. ...

  20. Design Considerations for mHealth Programs Targeting Smokers Not Yet Ready to Quit: Results of a Sequential Mixed-Methods Study.

    PubMed

    McClure, Jennifer B; Heffner, Jaimee; Hohl, Sarah; Klasnja, Predrag; Catz, Sheryl L

    2017-03-10

    Mobile health (mHealth) smoking cessation programs are typically designed for smokers who are ready to quit smoking. In contrast, most smokers want to quit someday but are not yet ready to quit. If mHealth apps were designed for these smokers, they could potentially encourage and assist more people to quit smoking. No prior studies have specifically examined the design considerations of mHealth apps targeting smokers who are not yet ready to quit. To inform the user-centered design of mHealth apps for smokers who were not yet ready to quit by assessing (1) whether these smokers were interested in using mHealth tools to change their smoking behavior; (2) their preferred features, functionality, and content of mHealth programs addressing smoking; and (3) considerations for marketing or distributing these programs to promote their uptake. We conducted a sequential exploratory, mixed-methods study. Qualitative interviews (phase 1, n=15) were completed with a demographically diverse group of smokers who were smartphone owners and wanted to quit smoking someday, but not yet. Findings informed a Web-based survey of smokers from across the United States (phase 2, n=116). Data were collected from April to September, 2016. Findings confirmed that although smokers not yet ready to quit are not actively seeking treatment or using cessation apps, most would be interested in using these programs to help them reduce or change their smoking behavior. Among phase 2 survey respondents, the app features, functions, and content rated most highly were (1) security of personal information; (2) the ability to track smoking, spending, and savings; (3) content that adaptively changes with one's needs; (4) the ability to request support as needed; (5) the ability to earn and redeem awards for program use; (6) guidance on how to quit smoking; and (7) content specifically addressing management of nicotine withdrawal, stress, depression, and anxiety. Results generally did not vary by stage of change for quitting smoking (precontemplation vs contemplation). The least popular feature was the ability to share progress via social media. Relevant to future marketing or distribution considerations, smokers were price-sensitive and valued empirically validated programs. Program source, expert recommendations, and user ratings were also important considerations. Smokers who are not yet ready to quit represent an important target group for intervention. Study findings suggest that many of these individuals are receptive to using mHealth tools to reduce or quit smoking, despite not having made a commitment to quit yet. The preferences for specific mHealth intervention features, functionality, and content outlined in this paper can aid addiction treatment experts, design specialists, and software developers interested in creating engaging interventions for smokers who want to quit in the future but are not yet committed to this important health goal. ©Jennifer B McClure, Jaimee Heffner, Sarah Hohl, Predrag Klasnja, Sheryl L Catz. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 10.03.2017.

  1. Multistep Lattice-Voxel method utilizing lattice function for Monte-Carlo treatment planning with pixel based voxel model.

    PubMed

    Kumada, H; Saito, K; Nakamura, T; Sakae, T; Sakurai, H; Matsumura, A; Ono, K

    2011-12-01

    Treatment planning for boron neutron capture therapy generally utilizes Monte-Carlo methods for calculation of the dose distribution. The new treatment planning system JCDS-FX employs the multi-purpose Monte-Carlo code PHITS to calculate the dose distribution. JCDS-FX allows building a precise voxel model consisting of pixel-based voxel cells as small as 0.4 × 0.4 × 2.0 mm³ in order to perform high-accuracy dose estimation, e.g. for calculating the dose distribution in a human body. However, reducing the voxel size increases calculation time considerably. The aim of this study is to investigate sophisticated modeling methods which can perform Monte-Carlo calculations for human geometry efficiently. We therefore devised a new voxel modeling method, the "Multistep Lattice-Voxel method," which can configure a voxel model that combines different voxel sizes by applying the lattice function repeatedly. To verify the performance of calculations with this modeling method, several calculations for human geometry were carried out. The results demonstrated that the Multistep Lattice-Voxel method enabled the precise voxel model to reduce calculation time substantially while keeping the high accuracy of the dose estimation. Copyright © 2011 Elsevier Ltd. All rights reserved.
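
    The trade-off that motivates combining voxel sizes can be illustrated with a generic two-level voxel model in Python. This is an illustrative analogue, not the JCDS-FX/PHITS lattice implementation; the class name, grid sizes, and refinement factor are invented for the example.

      import numpy as np

      class TwoLevelVoxelModel:
          """Coarse lattice everywhere; fine sub-lattices only where detail is needed."""

          def __init__(self, coarse_shape=(50, 50, 50), refine_factor=5):
              self.coarse = np.zeros(coarse_shape, dtype=np.uint8)  # material index per coarse cell
              self.refine_factor = refine_factor
              self.fine_blocks = {}                                 # (i, j, k) -> fine sub-grid

          def refine_cell(self, ijk):
              """Replace one coarse cell by a refine_factor**3 sub-lattice."""
              f = self.refine_factor
              self.fine_blocks[ijk] = np.full((f, f, f), self.coarse[ijk], dtype=np.uint8)

          def n_cells(self):
              """Total number of addressable cells, coarse plus refined."""
              return self.coarse.size + len(self.fine_blocks) * self.refine_factor ** 3

      model = TwoLevelVoxelModel()
      for idx in [(25, 25, k) for k in range(10)]:   # refine a small region of interest
          model.refine_cell(idx)
      print(model.n_cells())   # 126,250 cells versus 15,625,000 for a uniformly fine grid

    Refining only the region of interest keeps the cell count, and hence the Monte-Carlo tracking cost, far below that of a uniformly fine model, which is the effect a multistep lattice approach exploits.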

  2. A Monte-Carlo game theoretic approach for Multi-Criteria Decision Making under uncertainty

    NASA Astrophysics Data System (ADS)

    Madani, Kaveh; Lund, Jay R.

    2011-05-01

    Game theory provides a useful framework for studying Multi-Criteria Decision Making problems. This paper suggests modeling Multi-Criteria Decision Making problems as strategic games and solving them using non-cooperative game theory concepts. The suggested method can be used to prescribe non-dominated solutions and also can be used as a method to predict the outcome of a decision making problem. Non-cooperative stability definitions for solving the games allow consideration of non-cooperative behaviors, often neglected by other methods which assume perfect cooperation among decision makers. To deal with the uncertainty in input variables, a Monte-Carlo Game Theory (MCGT) approach is suggested, which maps the stochastic problem into many deterministic strategic games. The games are solved using non-cooperative stability definitions and the results include possible effects of uncertainty in input variables on outcomes. The method can handle multi-criteria multi-decision-maker problems with uncertainty. The suggested method does not require criteria weighting, a compound decision objective, or accurate quantitative (cardinal) information, as it simplifies the decision analysis by solving problems based on qualitative (ordinal) information, substantially reducing the computational burden. The MCGT method is applied to analyze California's Sacramento-San Joaquin Delta problem. The suggested method provides insights, identifies non-dominated alternatives, and predicts likely decision outcomes.
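
    A minimal sketch of the MCGT idea is given below in Python; it is illustrative only. The payoff tensor, the uncertainty model, and the use of pure-strategy Nash stability as the single stability definition are assumptions for this toy example; the paper considers several non-cooperative stability definitions and an actual case study.

      import itertools
      import numpy as np

      def pure_nash_equilibria(payoffs):
          """Pure-strategy Nash equilibria of an n-player strategic game.

          payoffs[s1, ..., sn, i] is player i's (ordinal) payoff at that strategy profile.
          """
          n_players = payoffs.shape[-1]
          equilibria = []
          for profile in itertools.product(*(range(k) for k in payoffs.shape[:-1])):
              stable = True
              for i in range(n_players):
                  for dev in range(payoffs.shape[i]):
                      alt = list(profile); alt[i] = dev
                      if payoffs[tuple(alt) + (i,)] > payoffs[profile + (i,)]:
                          stable = False
                          break
                  if not stable:
                      break
              if stable:
                  equilibria.append(profile)
          return equilibria

      def monte_carlo_game(sample_payoffs, n_samples=1000, rng=None):
          """Frequency with which each profile is stable across sampled deterministic games."""
          rng = rng or np.random.default_rng(0)
          counts = {}
          for _ in range(n_samples):
              for eq in pure_nash_equilibria(sample_payoffs(rng)):
                  counts[eq] = counts.get(eq, 0) + 1
          return {eq: c / n_samples for eq, c in counts.items()}

      # Two decision makers, two alternatives each; ordinal payoffs uncertain within +/-1.
      base = np.array([[[3, 2], [1, 1]],
                       [[2, 3], [2, 2]]], dtype=float)
      print(monte_carlo_game(lambda rng: base + rng.uniform(-1, 1, base.shape)))

    Each sampled deterministic game is solved on its own, and the tallied stability frequencies indicate how robust each alternative is to the input uncertainty, mirroring the mapping of the stochastic problem onto many deterministic strategic games.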

  3. Superfast maximum-likelihood reconstruction for quantum tomography

    NASA Astrophysics Data System (ADS)

    Shang, Jiangwei; Zhang, Zhengyun; Ng, Hui Khoon

    2017-06-01

    Conventional methods for computing maximum-likelihood estimators (MLE) often converge slowly in practical situations, leading to a search for simplifying methods that rely on additional assumptions for their validity. In this work, we provide a fast and reliable algorithm for maximum-likelihood reconstruction that avoids this slow convergence. Our method utilizes the state-of-the-art convex optimization scheme, an accelerated projected-gradient method, that allows one to accommodate the quantum nature of the problem in a different way than in the standard methods. We demonstrate the power of our approach by comparing its performance with other algorithms for n -qubit state tomography. In particular, an eight-qubit situation that purportedly took weeks of computation time in 2005 can now be completed in under a minute for a single set of data, with far higher accuracy than previously possible. This refutes the common claim that MLE reconstruction is slow and reduces the need for alternative methods that often come with difficult-to-verify assumptions. In fact, recent methods assuming Gaussian statistics or relying on compressed sensing ideas are demonstrably inapplicable for the situation under consideration here. Our algorithm can be applied to general optimization problems over the quantum state space; the philosophy of projected gradients can further be utilized for optimization contexts with general constraints.
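
    The step that lets a projected-gradient iteration respect the quantum constraints is the Euclidean projection onto the set of density matrices (Hermitian, positive semidefinite, unit trace). The Python sketch below shows that projection and a plain, un-accelerated gradient step; it is a generic illustration using the standard simplex-projection routine, not the authors' code.

      import numpy as np

      def project_to_simplex(v):
          """Euclidean projection of a real vector onto the probability simplex."""
          u = np.sort(v)[::-1]
          css = np.cumsum(u)
          rho = np.nonzero(u + (1.0 - css) / np.arange(1, len(v) + 1) > 0)[0][-1]
          theta = (1.0 - css[rho]) / (rho + 1)
          return np.maximum(v + theta, 0.0)

      def project_to_density_matrix(H):
          """Euclidean projection of a Hermitian matrix onto {rho: rho >= 0, tr(rho) = 1}."""
          H = 0.5 * (H + H.conj().T)               # symmetrize against round-off
          eigvals, eigvecs = np.linalg.eigh(H)
          eigvals = project_to_simplex(eigvals)    # project the spectrum, keep the eigenbasis
          return (eigvecs * eigvals) @ eigvecs.conj().T

      def projected_gradient_step(rho, grad, step):
          """One projected-gradient update of the negative log-likelihood."""
          return project_to_density_matrix(rho - step * grad)

    An accelerated projected-gradient method additionally carries a momentum term between such steps; the projection is what keeps every iterate a physical state without reparameterizing the problem.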

  4. Comparison of electronic data capture (EDC) with the standard data capture method for clinical trial data.

    PubMed

    Walther, Brigitte; Hossin, Safayet; Townend, John; Abernethy, Neil; Parker, David; Jeffries, David

    2011-01-01

    Traditionally, clinical research studies rely on collecting data with case report forms, which are subsequently entered into a database to create electronic records. Although well established, this method is time-consuming and error-prone. This study compares four electronic data capture (EDC) methods with the conventional approach with respect to duration of data capture and accuracy. It was performed in a West African setting, where clinical trials involve data collection from urban, rural and often remote locations. Three types of commonly available EDC tools were assessed in face-to-face interviews; netbook, PDA, and tablet PC. EDC performance during telephone interviews via mobile phone was evaluated as a fourth method. The Graeco Latin square study design allowed comparison of all four methods to standard paper-based recording followed by data double entry while controlling simultaneously for possible confounding factors such as interview order, interviewer and interviewee. Over a study period of three weeks the error rates decreased considerably for all EDC methods. In the last week of the study the data accuracy for the netbook (5.1%, CI95%: 3.5-7.2%) and the tablet PC (5.2%, CI95%: 3.7-7.4%) was not significantly different from the accuracy of the conventional paper-based method (3.6%, CI95%: 2.2-5.5%), but error rates for the PDA (7.9%, CI95%: 6.0-10.5%) and telephone (6.3%, CI95% 4.6-8.6%) remained significantly higher. While EDC-interviews take slightly longer, data become readily available after download, making EDC more time effective. Free text and date fields were associated with higher error rates than numerical, single select and skip fields. EDC solutions have the potential to produce similar data accuracy compared to paper-based methods. Given the considerable reduction in the time from data collection to database lock, EDC holds the promise to reduce research-associated costs. However, the successful implementation of EDC requires adjustment of work processes and reallocation of resources.

  5. Comparison of Electronic Data Capture (EDC) with the Standard Data Capture Method for Clinical Trial Data

    PubMed Central

    Walther, Brigitte; Hossin, Safayet; Townend, John; Abernethy, Neil; Parker, David; Jeffries, David

    2011-01-01

    Background Traditionally, clinical research studies rely on collecting data with case report forms, which are subsequently entered into a database to create electronic records. Although well established, this method is time-consuming and error-prone. This study compares four electronic data capture (EDC) methods with the conventional approach with respect to duration of data capture and accuracy. It was performed in a West African setting, where clinical trials involve data collection from urban, rural and often remote locations. Methodology/Principal Findings Three types of commonly available EDC tools were assessed in face-to-face interviews; netbook, PDA, and tablet PC. EDC performance during telephone interviews via mobile phone was evaluated as a fourth method. The Graeco Latin square study design allowed comparison of all four methods to standard paper-based recording followed by data double entry while controlling simultaneously for possible confounding factors such as interview order, interviewer and interviewee. Over a study period of three weeks the error rates decreased considerably for all EDC methods. In the last week of the study the data accuracy for the netbook (5.1%, CI95%: 3.5–7.2%) and the tablet PC (5.2%, CI95%: 3.7–7.4%) was not significantly different from the accuracy of the conventional paper-based method (3.6%, CI95%: 2.2–5.5%), but error rates for the PDA (7.9%, CI95%: 6.0–10.5%) and telephone (6.3%, CI95% 4.6–8.6%) remained significantly higher. While EDC-interviews take slightly longer, data become readily available after download, making EDC more time effective. Free text and date fields were associated with higher error rates than numerical, single select and skip fields. Conclusions EDC solutions have the potential to produce similar data accuracy compared to paper-based methods. Given the considerable reduction in the time from data collection to database lock, EDC holds the promise to reduce research-associated costs. However, the successful implementation of EDC requires adjustment of work processes and reallocation of resources. PMID:21966505

  6. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as the source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique derives the spectrum of a document, in which the consideration given to each quality requirement is represented numerically. We can thus objectively identify whether the quality requirements considered in a requirements document are carried through to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
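
    As a rough illustration of what a document "spectrum" could look like, the Python sketch below scores a requirements document and a design document by the share of quality-related keywords falling under each quality characteristic and compares the two. The keyword lists, the normalization, and the cosine comparison are invented for this example and are not the paper's technique.

      import math
      import re

      KEYWORDS = {
          "performance": {"fast", "latency", "throughput", "response"},
          "security": {"encrypt", "encryption", "authentication", "authorization", "audit"},
          "usability": {"usable", "learnable", "accessible", "ui"},
      }

      def spectrum(text):
          """Share of a document's quality-related terms per quality characteristic."""
          words = re.findall(r"[a-z]+", text.lower())
          counts = {q: sum(w in kws for w in words) for q, kws in KEYWORDS.items()}
          total = sum(counts.values()) or 1
          return {q: c / total for q, c in counts.items()}

      def cosine(a, b):
          """Similarity of two spectra; low values flag quality concerns dropped in design."""
          dot = sum(a[q] * b[q] for q in KEYWORDS)
          na = math.sqrt(sum(v * v for v in a.values()))
          nb = math.sqrt(sum(v * v for v in b.values()))
          return dot / (na * nb) if na and nb else 0.0

      req_spec = "The system shall encrypt data and respond with low latency."
      design_doc = "The design uses TLS encryption; an audit log records authentication."
      print(cosine(spectrum(req_spec), spectrum(design_doc)))

    In the actual method the spectra are derived from the requirements and design documents and compared to check whether each quality requirement remains represented; the toy keyword matching above merely fixes the idea.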

  7. Twelve tips for getting started using mixed methods in medical education research.

    PubMed

    Lavelle, Ellen; Vuk, Jasna; Barber, Carolyn

    2013-04-01

    Mixed methods research, which is gaining popularity in medical education, provides a new and comprehensive approach for addressing teaching, learning, and evaluation issues in the field. The aim of this article is to provide medical education researchers with 12 tips, based on consideration of current literature in the health professions and in educational research, for conducting and disseminating mixed methods research. Engaging in mixed methods research requires consideration of several major components: the mixed methods paradigm, types of problems, mixed method designs, collaboration, and developing or extending theory. Mixed methods is an ideal tool for addressing a full range of problems in medical education to include development of theory and improving practice.

  8. Phosphorus speciation by coupled HPLC-ICPMS: low level determination of reduced phosphorus in natural materials

    NASA Astrophysics Data System (ADS)

    Atlas, Zachary; Pasek, Matthew; Sampson, Jacqueline

    2015-04-01

    Phosphorus is a geologically important minor element in the Earth's crust commonly found as relatively insoluble apatite. This constraint causes phosphorus to be a key limiting nutrient in biologic processes. Despite this, phosphorus plays a direct role in the formation of DNA, RNA and other cellular materials. Recent work suggests that, since reduced phosphorus is considerably more soluble than oxidized phosphorus, it was integrally involved in the development of life on the early Earth and may continue to play a role in biologic productivity to this day. This work examines a new method for quantification and identification of reduced phosphorus as well as applications to the speciation of organo-phosphates separated by coupled HPLC-ICP-MS. We show that phosphorus species (P1+, P3+ and P5+) are cleanly separated in the HPLC; coupled with the ICP-MS reaction cell, oxygen is used as a reaction gas to effectively convert elemental P to P-O. Analysis at m/z = 47 produces a lower background and flatter baseline chromatography than analyses performed at m/z = 31. Results suggest very low detection limits (0.05 μM) for P species analyzed as P-O. Additionally, we show that this technique has the potential to speciate at least five other forms of phosphorus compounds. We verified the efficacy of the method on numerous materials, including leached Archean rocks, suburban retention-pond waters, and blood and urine samples; most samples show small but detectable levels of reduced phosphorus and/or organophosphates. This finding in nearly all substances analyzed supports the assumption that redox processing of phosphorus has played a significant role throughout the history of the Earth and that its presence in the present environment is nearly ubiquitous, with the reduced-oxidation-state phosphorus compounds phosphite and hypophosphite potentially acting as significant constituents in the anaerobic environment.

  9. Reduced molybdenum-oxide-based core-shell hybrids: "blue" electrons are delocalized on the shell.

    PubMed

    Todea, Ana Maria; Szakács, Julia; Konar, Sanjit; Bögge, Hartmut; Crans, Debbie C; Glaser, Thorsten; Rousselière, Hélène; Thouvenot, René; Gouzerh, Pierre; Müller, Achim

    2011-06-06

    The present study refers to a variety of reduced metal-oxide core-shell hybrids, which are unique with regard to their electronic structure, their geometry, and their formation. They contain spherical {Mo72Fe30} Keplerate-type shells encapsulating Keggin-type polyoxomolybdates based on very weak interactions. Studies on the encapsulation of molybdosilicate as well as on the earlier reported molybdophosphate, coupled with the use of several physical methods for the characterization, led to unprecedented results (see title). Upon standing in air at room temperature, acidified aqueous solutions obtained by dissolving sodium molybdate, iron(II) chloride, acetic acid, and molybdosilicic acid led to the precipitation of monoclinic greenish crystals (1). A rhombohedral variant (2) has also been observed. Upon drying at room temperature, compound 3 with a layer structure was obtained from 1 in a solid-state reaction based on cross-linking of the shells. The compounds 1, 2, and 3 have been characterized by a combination of methods including single-crystal X-ray crystallography, magnetic studies, as well as IR, Mössbauer, (resonance) Raman, and electronic absorption spectroscopy. In connection with detailed studies of the guest-free two-electron-reduced {Mo72Fe30}-type Keplerate (4) and of the previously reported molybdophosphate-based hybrids (including 31P NMR spectroscopy results), it is unambiguously proved that 1, 2, and 3 contain non-reduced Keggin ion cores and reduced {Mo72Fe30}-type shells. The results are discussed in terms of redox considerations (the shell as well as the core can be reduced), including those related to the reduction of "molybdates" by FeII, which is of interdisciplinary, including catalytic, interest (the MoVI/MoV and FeIII/FeII couples have very close redox potentials!), while also referring to the special formation of the hybrids based on chemical Darwinism.

  10. Efficient geometry optimization by Hellmann-Feynman forces with the anti-Hermitian contracted Schrödinger equation

    NASA Astrophysics Data System (ADS)

    Foley, Jonathan J.; Mazziotti, David A.

    2010-10-01

    An efficient method for geometry optimization based on solving the anti-Hermitian contracted Schrödinger equation (ACSE) is presented. We formulate a reduced version of the Hellmann-Feynman theorem (HFT) in terms of the two-electron reduced Hamiltonian operator and the two-electron reduced density matrix (2-RDM). The HFT offers a considerable reduction in computational cost over methods which rely on numerical derivatives. While previous geometry optimizations with numerical gradients required 2M evaluations of the ACSE where M is the number of nuclear degrees of freedom, the HFT requires only a single ACSE calculation of the 2-RDM per gradient. Synthesizing geometry optimization techniques with recent extensions of the ACSE theory to arbitrary electronic and spin states provides an important suite of tools for accurately determining equilibrium and transition-state structures of ground- and excited-state molecules in closed- and open-shell configurations. The ability of the ACSE to balance single- and multi-reference correlation is particularly advantageous in the determination of excited-state geometries where the electronic configurations differ greatly from the ground-state reference. Applications are made to closed-shell molecules N2, CO, H2O, the open-shell molecules B2 and CH, and the excited state molecules N2, B2, and BH. We also study the HCN ↔ HNC isomerization and the geometry optimization of hydroxyurea, a molecule which has a significant role in the treatment of sickle-cell anaemia.
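
    Schematically, and with illustrative notation rather than the paper's, the reduced Hellmann-Feynman gradient takes the following form, where ²K(R) is the two-electron reduced Hamiltonian at nuclear geometry R and ²D is the 2-RDM.

      % Hedged sketch: electronic part of the gradient only; the trivial
      % nuclear-repulsion contribution and any basis-set (Pulay-type) terms are omitted.
      \begin{equation*}
        E(R) = \operatorname{Tr}\!\left[{}^{2}K(R)\,{}^{2}D\right],
        \qquad
        \frac{\partial E}{\partial R_\alpha}
          = \operatorname{Tr}\!\left[\frac{\partial\,{}^{2}K(R)}{\partial R_\alpha}\,{}^{2}D\right].
      \end{equation*}

    Because the 2-RDM from a converged ACSE solution can be reused for every nuclear degree of freedom, one ACSE calculation yields the whole gradient, in contrast to the 2M solutions needed for central finite differences.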

  11. Limitations and possibilities of low cell number ChIP-seq

    PubMed Central

    2012-01-01

    Background Chromatin immunoprecipitation coupled with high-throughput DNA sequencing (ChIP-seq) offers high resolution, genome-wide analysis of DNA-protein interactions. However, current standard methods require abundant starting material in the range of 1–20 million cells per immunoprecipitation, and remain a bottleneck to the acquisition of biologically relevant epigenetic data. Using a ChIP-seq protocol optimised for low cell numbers (down to 100,000 cells / IP), we examined the performance of the ChIP-seq technique on a series of decreasing cell numbers. Results We present an enhanced native ChIP-seq method tailored to low cell numbers that represents a 200-fold reduction in input requirements over existing protocols. The protocol was tested over a range of starting cell numbers covering three orders of magnitude, enabling determination of the lower limit of the technique. At low input cell numbers, increased levels of unmapped and duplicate reads reduce the number of unique reads generated, and can drive up sequencing costs and affect sensitivity if ChIP is attempted from too few cells. Conclusions The optimised method presented here considerably reduces the input requirements for performing native ChIP-seq. It extends the applicability of the technique to isolated primary cells and rare cell populations (e.g. biobank samples, stem cells), and in many cases will alleviate the need for cell culture and any associated alteration of epigenetic marks. However, this study highlights a challenge inherent to ChIP-seq from low cell numbers: as cell input numbers fall, levels of unmapped sequence reads and PCR-generated duplicate reads rise. We discuss a number of solutions to overcome the effects of reducing cell number that may aid further improvements to ChIP performance. PMID:23171294

  12. Mechanical response tissue analyzer for estimating bone strength

    NASA Technical Reports Server (NTRS)

    Arnaud, Sara B.; Steele, Charles; Mauriello, Anthony

    1991-01-01

    One of the major concerns for extended space flight is weakness of the long bones of the legs, composed primarily of cortical bone, which functions to provide mechanical support. The strength of cortical bone is due to its complex structure, described simplistically as cylinders of parallel osteons composed of layers of mineralized collagen. The reduced mechanical stresses during space flight, or immobilization of bone on Earth, reduce the mineral content and change the components of its matrix and structure so that its strength is reduced. Currently, the established clinical measures of bone strength are indirect. The measures are based on determinations of mineral density by means of radiography, photon absorptiometry, and quantitative computed tomography. While the mineral content of bone is essential to its strength, there is growing awareness of the limitations of the measurement as the sole predictor of fracture risk in metabolic bone diseases, especially osteoporosis. Other experimental methods in clinical trials that more directly evaluate the physical properties of bone, and do not require exposure to radiation, include ultrasound, acoustic emission, and low-frequency mechanical vibration. The last method can be considered a direct measure of the functional capacity of a long bone since it quantifies the mechanical response to a stimulus delivered directly to the bone. A low-frequency vibration induces a response (impedance) curve with a minimum at the resonant frequency, which a few investigators use for the evaluation of the bone. An alternative approach, the method under consideration, is to use the response curve as the basis for determination of the bone bending stiffness EI (where E is the intrinsic material property and I is the cross-sectional moment of inertia) and mass, fundamental mechanical properties of bone.

  13. A novel seizure detection algorithm informed by hidden Markov model event states

    NASA Astrophysics Data System (ADS)

    Baldassano, Steven; Wulsin, Drausin; Ung, Hoameng; Blevins, Tyler; Brown, Mesha-Gay; Fox, Emily; Litt, Brian

    2016-06-01

    Objective. Recently the FDA approved the first responsive, closed-loop intracranial device to treat epilepsy. Because these devices must respond within seconds of seizure onset and not miss events, they are tuned to have high sensitivity, leading to frequent false positive stimulations and decreased battery life. In this work, we propose a more robust seizure detection model. Approach. We use a Bayesian nonparametric Markov switching process to parse intracranial EEG (iEEG) data into distinct dynamic event states. Each event state is then modeled as a multidimensional Gaussian distribution to allow for predictive state assignment. By detecting event states highly specific for seizure onset zones, the method can identify precise regions of iEEG data associated with the transition to seizure activity, reducing false positive detections associated with interictal bursts. The seizure detection algorithm was translated to a real-time application and validated in a small pilot study using 391 days of continuous iEEG data from two dogs with naturally occurring, multifocal epilepsy. A feature-based seizure detector modeled after the NeuroPace RNS System was developed as a control. Main results. Our novel seizure detection method demonstrated an improvement in false negative rate (0/55 seizures missed versus 2/55 seizures missed) as well as a significantly reduced false positive rate (0.0012 h⁻¹ versus 0.058 h⁻¹). All seizures were detected an average of 12.1 ± 6.9 s before the unequivocal epileptic onset (UEO). Significance. This algorithm represents a computationally inexpensive, individualized, real-time detection method suitable for implantable antiepileptic devices that may considerably reduce the false positive rate relative to current industry standards.
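
    The "predictive state assignment" described above can be pictured with a generic Python sketch: each incoming iEEG feature window is scored under per-state multivariate Gaussians, and a detection is raised when windows keep landing in onset-associated states. The feature extraction, state parameters, and run-length threshold are placeholders, not the published model, which learns the states and their dynamics nonparametrically.

      import numpy as np
      from scipy.stats import multivariate_normal

      def assign_state(features, state_means, state_covs):
          """Assign one feature vector to the most likely learned event state."""
          log_liks = [multivariate_normal.logpdf(features, mean=m, cov=c)
                      for m, c in zip(state_means, state_covs)]
          return int(np.argmax(log_liks))

      def detect_seizure(feature_stream, state_means, state_covs, seizure_states, min_run=3):
          """Detect when `min_run` consecutive windows land in onset-associated states."""
          run = 0
          for t, features in enumerate(feature_stream):
              if assign_state(features, state_means, state_covs) in seizure_states:
                  run += 1
                  if run >= min_run:
                      return t          # index of the window that triggers the detection
              else:
                  run = 0
          return None

    Requiring a short run of consecutive onset-state windows is one simple way to suppress detections on isolated interictal bursts, which is the false-positive mechanism the abstract highlights.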

  14. Short tandem repeat profiling: part of an overall strategy for reducing the frequency of cell misidentification.

    PubMed

    Nims, Raymond W; Sykes, Greg; Cottrill, Karin; Ikonomi, Pranvera; Elmore, Eugene

    2010-12-01

    The role of cell authentication in biomedical science has received considerable attention, especially within the past decade. This quality control attribute is now beginning to be given the emphasis it deserves by granting agencies and by scientific journals. Short tandem repeat (STR) profiling, one of a few DNA profiling technologies now available, is being proposed for routine identification (authentication) of human cell lines, stem cells, and tissues. The advantage of this technique over methods such as isoenzyme analysis, karyotyping, human leukocyte antigen typing, etc., is that STR profiling can establish identity to the individual level, provided that the appropriate number and types of loci are evaluated. To best employ this technology, a standardized protocol and a data-driven, quality-controlled, and publically searchable database will be necessary. This public STR database (currently under development) will enable investigators to rapidly authenticate human-based cultures to the individual from whom the cells were sourced. Use of similar approaches for non-human animal cells will require developing other suitable loci sets. While implementing STR analysis on a more routine basis should significantly reduce the frequency of cell misidentification, additional technologies may be needed as part of an overall authentication paradigm. For instance, isoenzyme analysis, PCR-based DNA amplification, and sequence-based barcoding methods enable rapid confirmation of a cell line's species of origin while screening against cross-contaminations, especially when the cells present are not recognized by the species-specific STR method. Karyotyping may also be needed as a supporting tool during establishment of an STR database. Finally, good cell culture practices must always remain a major component of any effort to reduce the frequency of cell misidentification.

  15. [A new method of endoscopic harvesting of the great saphenous vein in an open system].

    PubMed

    Vecherskiĭ, Iu Iu; Zatolokin, V V; Petlin, K A; Akhmedov, Sh D; Shipulin, V M

    We examined a total of 246 patients who underwent coronary artery bypass grafting with the use of the great saphenous vein (GSV). The patients were subdivided into two groups. In Group One (n=121) the GSV was harvested by a new endoscopic technique in an open system using Karl Storz equipment and a LigaSure electric dissector. In Group Two (n=125) the vein was harvested by the traditional open method. In all patients we evaluated complications in the early postoperative period, 13±2.5 days after the operation. The rate of recurrent angina pectoris in both groups was low and did not differ (1.65% in Group One and 1.6% in Group Two). The two groups differed significantly in the incidence of postoperative complications of the lower limbs in the zone of GSV harvest (9.09% in Group One and 26.4% in Group Two, p=0.131). Group Two patients (open harvest of the GSV) considerably more often developed lymphorrhoea, haematomas, and suture-line disruption (21.6%) compared with Group One (endoscopic method) patients (3.3%) (p=0.167), which in 10.4% of cases required secondary surgical debridement of wounds after the open harvest of the GSV. The length of hospital stay for Group Two patients consequently increased significantly (15±4.5 days) compared with Group One patients (8±1.1 days) (p=0.361). Hence, the endoscopic method of harvesting the GSV in the open CO2 system makes it possible to obtain a good cosmetic effect on the lower limbs after the operation and to considerably decrease the complication rate, thus reducing the length of hospital stay.

  16. Bioanalytical method transfer considerations of chromatographic-based assays.

    PubMed

    Williard, Clark V

    2016-07-01

    Bioanalysis is an important part of the modern drug development process. The business practice of outsourcing and transferring bioanalytical methods from laboratory to laboratory has increasingly become a crucial strategy for successful and efficient delivery of therapies to the market. This chapter discusses important considerations when transferring various types of chromatographic-based assays in today's pharmaceutical research and development environment.

  17. Optimization of the operating conditions of gas-turbine power stations considering the effect of equipment deterioration

    NASA Astrophysics Data System (ADS)

    Aminov, R. Z.; Kozhevnikov, A. I.

    2017-10-01

    In recent years, a trend towards growing nonuniformity of energy consumption and generation schedules has been observed in most power systems all over the world. The increasing share of renewable energy sources is one of the important challenges for many countries. The poor predictability of such energy sources necessitates a search for practical solutions. Presently, the most efficient method for compensating for the nonuniform generation of electric power by renewable energy sources (predominantly wind and solar energy) is generation of power at conventional fossil-fuel-fired power stations. In Russia, this problem is caused by the increasing share of nuclear power stations in the generating capacity structure, which are most efficient when operating under base-load conditions. Introduction of hydropower and pumped-storage hydroelectric power plants and other energy-storage technologies does not cover the demand for load-following capacity. Owing to their simple design, low construction costs, and sufficiently high economic efficiency, gas turbine plants (GTPs) prove to be the most suitable for covering nonuniform electric-demand schedules. However, when gas turbines are operated under varying duty conditions, the lifetime of the primary thermally stressed components is considerably reduced and, consequently, repair costs increase. A method is proposed for determining the total operating costs considering the deterioration of the gas turbine equipment under varying duty and start-stop conditions, and a methodology for optimizing the loading modes of the gas turbine equipment is developed. Consideration of the lifetime component allows varying the optimal operating conditions and, in some cases, rejecting short-time stops of the gas turbine plants. Calculations performed over a wide range of fuel prices and capital investments per gas turbine unit show that the economic effectiveness can be increased by 5-15% by varying the operating conditions and switching to the optimal operating modes. Consequently, irrespective of the fuel price, application of the proposed method results in selection of the most beneficial operating conditions, and including the lifetime expenditure in the optimization criterion enables enhancement of the operating efficiency.

  18. Direct SQP-methods for solving optimal control problems with delays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goellmann, L.; Bueskens, C.; Maurer, H.

    The maximum principle for optimal control problems with delays leads to a boundary value problem (BVP) which is retarded in the state and advanced in the costate function. Based on shooting techniques, solution methods for this type of BVP have been proposed. In recent years, direct optimization methods have been favored for solving control problems without delays. Direct methods approximate the control and the state over a fixed mesh and solve the resulting NLP-problem with SQP-methods. These methods dispense with the costate function and have been shown to be robust and efficient. In this paper, we propose a direct SQP-method for retarded control problems. In contrast to conventional direct methods, only the control variable is approximated, e.g. by spline functions. The state is computed via a high-order Runge-Kutta type algorithm and does not enter the NLP-problem explicitly through an equation. This approach reduces the number of optimization variables considerably and is implementable even on a PC. Our method is illustrated by the numerical solution of retarded control problems with constraints. In particular, we consider the control of a continuous stirred tank reactor which has been solved by dynamic programming. This example illustrates the robustness and efficiency of the proposed method. Open questions concerning sufficient conditions and convergence of discretized NLP-problems are discussed.
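
    The control-parameterization idea can be illustrated on a toy retarded problem in Python: only the control values are optimization variables, the delayed state is recomputed by integration inside the objective, and the NLP is handed to an SQP solver (SciPy's SLSQP). Explicit Euler and piecewise-constant controls stand in for the paper's higher-order Runge-Kutta integration and spline parameterization; the dynamics, horizon, and bounds are invented for the example.

      import numpy as np
      from scipy.optimize import minimize

      T, TAU, N = 5.0, 1.0, 25           # horizon, delay, number of control intervals
      DT = T / N                         # integration step = control interval (toy choice)
      DELAY_STEPS = int(round(TAU / DT))

      def simulate(u):
          """Integrate x'(t) = -x(t - tau) + u(t), with history x(t <= 0) = 1."""
          x = np.empty(N + 1)
          x[0] = 1.0
          cost = 0.0
          for k in range(N):
              x_delayed = 1.0 if k - DELAY_STEPS < 0 else x[k - DELAY_STEPS]
              x[k + 1] = x[k] + DT * (-x_delayed + u[k])
              cost += DT * (x[k] ** 2 + u[k] ** 2)    # quadratic running cost
          return x, cost

      def objective(u):
          # The state never appears as an NLP variable; it is recomputed here.
          return simulate(u)[1]

      result = minimize(objective, x0=np.zeros(N), method="SLSQP",   # an SQP solver
                        bounds=[(-2.0, 2.0)] * N)
      print(result.fun, result.x[:5])

    Because the state is eliminated from the NLP, the problem has only N control variables plus simple bounds, which is the reduction in optimization variables the abstract emphasizes.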

  19. Alternative approaches for vertebrate ecotoxicity tests in the ...

    EPA Pesticide Factsheets

    The need for alternative approaches to the use of vertebrate animals for the hazard assessment of chemicals and pollutants has become of increasing importance. It is now the first consideration when initiating a vertebrate ecotoxicity test to ensure that unnecessary use of vertebrate organisms is minimised wherever possible. For some regulatory purposes, the use of vertebrate organisms for environmental risk assessments (ERA) has even been banned, and in other situations the number of organisms tested has been dramatically reduced, or the severity of the procedure refined. However, there is still a long way to go to achieve replacement of vertebrate organisms for generating environmental hazard data. The development of animal alternatives is based not just on ethical considerations but also on reducing the cost of performing vertebrate ecotoxicity tests and, in some cases, on providing better information aimed at improving ERAs. The present focus paper provides an overview of the considerable advances that have been made towards alternative approaches for ecotoxicity assessments over the last few decades.

  20. Design of an Elliptic Curve Cryptography processor for RFID tag chips.

    PubMed

    Liu, Zilong; Liu, Dongsheng; Zou, Xuecheng; Lin, Hui; Cheng, Jian

    2014-09-26

    Radio Frequency Identification (RFID) is an important technique for wireless sensor networks and the Internet of Things. Recently, considerable research has been performed on the combination of public key cryptography and RFID. In this paper, an efficient architecture of an Elliptic Curve Cryptography (ECC) processor for RFID tag chips is presented. We adopt a new inversion algorithm which requires fewer registers to store variables than the traditional schemes. A new method for coordinate swapping is proposed, which can reduce the complexity of the controller and effectively shorten the time of the iterative calculation. A modified circular shift register architecture is presented, which is an effective way to reduce the area of the register files. Clock gating and an asynchronous counter are exploited to reduce the power consumption. The simulation and synthesis results show that the time needed for one elliptic curve scalar point multiplication over GF(2^163) is 176.7 K clock cycles and the gate area is 13.8 K with UMC 0.13 μm Complementary Metal Oxide Semiconductor (CMOS) technology. Moreover, the low power consumption and low cost make the Elliptic Curve Cryptography Processor (ECP) a prospective candidate for application in RFID tag chips.
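
    The paper's register-saving coordinate-swap scheme is specific to its hardware architecture, but the general role of a conditional swap in a scalar-multiplication ladder can be sketched generically in Python. The helper names, the use of full group operations (the abstract's GF(2^163) arithmetic would use projective, x-only differential formulas), and the swap-only-when-the-bit-changes trick are assumptions for this illustration, not the paper's design; `point_add` and `point_double` are assumed to handle the identity element.

      def cswap(bit, a, b):
          """Conditional swap: return (b, a) iff bit == 1.

          In hardware this is a multiplexer on the coordinate registers; swapping
          register roles instead of copying coordinates is what saves data movement."""
          return (b, a) if bit else (a, b)

      def montgomery_ladder(k, P, point_add, point_double, identity):
          """One point addition and one doubling per key bit, regardless of its value."""
          R0, R1 = identity, P
          prev_bit = 0
          for i in reversed(range(k.bit_length())):
              bit = (k >> i) & 1
              R0, R1 = cswap(bit ^ prev_bit, R0, R1)   # swap only when the key bit changes
              R1 = point_add(R0, R1)
              R0 = point_double(R0)
              prev_bit = bit
          R0, R1 = cswap(prev_bit, R0, R1)             # undo a trailing swap, if any
          return R0                                    # k * P

    The fixed add-then-double pattern keeps the operation sequence independent of the key bits, and driving the swap by the XOR of consecutive bits means most iterations move no data at all.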
