Sample records for cleaning applications algorithms

  1. Quantum machine learning for quantum anomaly detection

    NASA Astrophysics Data System (ADS)

    Liu, Nana; Rebentrost, Patrick

    2018-04-01

    Anomaly detection is used for identifying data that deviate from "normal" data patterns. Its usage on classical data finds diverse applications in many important areas such as finance, fraud detection, medical diagnoses, data cleaning, and surveillance. With the advent of quantum technologies, anomaly detection of quantum data, in the form of quantum states, may become an important component of quantum applications. Machine-learning algorithms are playing pivotal roles in anomaly detection using classical data. Two widely used algorithms are the kernel principal component analysis and the one-class support vector machine. We find corresponding quantum algorithms to detect anomalies in quantum states. We show that these two quantum algorithms can be performed using resources that are logarithmic in the dimensionality of quantum states. For pure quantum states, these resources can also be logarithmic in the number of quantum states used for training the machine-learning algorithm. This makes these algorithms potentially applicable to big quantum data applications.

  2. Development and validation of an algorithm for laser application in wound treatment 1

    PubMed Central

    da Cunha, Diequison Rite; Salomé, Geraldo Magela; Massahud, Marcelo Renato; Mendes, Bruno; Ferreira, Lydia Masako

    2017-01-01

    ABSTRACT Objective: To develop and validate an algorithm for laser wound therapy. Method: Methodological study and literature review. For the development of the algorithm, a review of the Health Sciences databases covering the past ten years was performed. The algorithm was evaluated by 24 participants: nurses, physiotherapists, and physicians. For data analysis, Cronbach's alpha coefficient and the chi-square test for independence were used. The level of significance of the statistical test was set at 5% (p<0.05). Results: The professionals' responses regarding the ease of reading the algorithm indicated: 41.70%, great; 41.70%, good; 16.70%, regular. Asked whether the algorithm was sufficient for supporting decisions related to wound evaluation and wound cleaning, 87.5% said yes to both questions. Asked whether the algorithm contained enough information to support their choice of laser parameters, 91.7% said yes. The questionnaire showed reliability by Cronbach's alpha coefficient (α = 0.962). Conclusion: The developed and validated algorithm showed reliability for evaluation, wound cleaning, and use of laser therapy in wounds. PMID:29211197

  3. Novel Signal Noise Reduction Method through Cluster Analysis, Applied to Photoplethysmography.

    PubMed

    Waugh, William; Allen, John; Wightman, James; Sims, Andrew J; Beale, Thomas A W

    2018-01-01

    Physiological signals can often become contaminated by noise from a variety of origins. In this paper, an algorithm is described for the reduction of sporadic noise from a continuous periodic signal. The design can be used where a sample of a periodic signal is required, for example, when an average pulse is needed for pulse wave analysis and characterization. The algorithm is based on cluster analysis for selecting similar repetitions or pulses from a periodic signal. This method selects individual pulses without noise, returns a clean pulse signal, and terminates when a sufficiently clean and representative signal is obtained. The algorithm is designed to be sufficiently compact to be implemented on a microcontroller embedded within a medical device. It has been validated through the removal of noise from an exemplar photoplethysmography (PPG) signal, showing increasing benefit as the noise contamination of the signal increases. The algorithm design is generalised to be applicable for a wide range of physiological (physical) signals.
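
    The pulse-selection idea described above can be sketched in a few lines of Python. This is an illustrative toy, not the authors' embedded implementation; the correlation threshold and the simple one-pass grouping are assumptions made for the sketch.

```python
# Toy sketch: group similar-shaped repetitions (pulses) of a periodic
# signal by shape correlation, then average the largest group to obtain
# a clean representative pulse.

def correlation(a, b):
    """Pearson correlation between two equal-length pulse shapes."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb) if va and vb else 0.0

def clean_template(pulses, threshold=0.9):
    """Assign each pulse to the first cluster whose seed it correlates
    with above `threshold`; average the largest cluster."""
    clusters = []
    for p in pulses:
        for c in clusters:
            if correlation(p, c[0]) > threshold:
                c.append(p)
                break
        else:
            clusters.append([p])
    best = max(clusters, key=len)
    n = len(best)
    return [sum(vals) / n for vals in zip(*best)]
```

A noisy outlier pulse correlates poorly with the clean repetitions, lands in its own cluster, and is excluded from the averaged template.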

  4. An optimized algorithm for multiscale wideband deconvolution of radio astronomical images

    NASA Astrophysics Data System (ADS)

    Offringa, A. R.; Smirnov, O.

    2017-10-01

    We describe a new multiscale deconvolution algorithm that can also be used in a multifrequency mode. The algorithm only affects the minor CLEAN loop. In single-frequency mode, the minor loop of our improved multiscale algorithm is over an order of magnitude faster than the CASA multiscale algorithm, and produces results of similar quality. For multifrequency deconvolution, a technique named joined-channel cleaning is used. In this mode, the minor loop of our algorithm is two to three orders of magnitude faster than CASA MSMFS. We extend the multiscale mode with automated scale-dependent masking, which allows structures to be cleaned below the noise. We describe a new scale-bias function for use in multiscale cleaning. We test a second deconvolution method that is a variant of the MORESANE deconvolution technique, and uses a convex optimization technique with isotropic undecimated wavelets as a dictionary. On simple, well-calibrated data, the convex optimization algorithm produces visually more representative models. On complex or imperfect data, the convex optimization algorithm has stability issues.
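
    The minor loop that such multiscale methods generalize is the classic single-scale Högbom CLEAN iteration: find the residual peak, subtract a scaled and shifted dirty beam, and record a component. The 1-D sketch below is our illustration of that basic loop, not the paper's optimized multiscale implementation; gain, threshold, and iteration cap are illustrative parameters.

```python
# Minimal 1-D Hogbom-style CLEAN minor loop.

def hogbom_clean(dirty, beam, gain=0.1, threshold=0.01, max_iter=1000):
    """`dirty`: dirty image (list of floats); `beam`: dirty beam centered
    at len(beam)//2. Returns (components, residual)."""
    residual = list(dirty)
    components = [0.0] * len(dirty)
    half = len(beam) // 2
    for _ in range(max_iter):
        peak = max(range(len(residual)), key=lambda i: abs(residual[i]))
        if abs(residual[peak]) < threshold:
            break
        step = gain * residual[peak]
        components[peak] += step
        for j, b in enumerate(beam):        # subtract shifted, scaled beam
            k = peak + j - half
            if 0 <= k < len(residual):
                residual[k] -= step * b
    return components, residual
```

For a single point source convolved with the beam, the loop accumulates nearly all the flux into one component and drives the residual below the threshold.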

  5. Detection of motion artifact patterns in photoplethysmographic signals based on time and period domain analysis.

    PubMed

    Couceiro, R; Carvalho, P; Paiva, R P; Henriques, J; Muehlsteff, J

    2014-12-01

    The presence of motion artifacts in photoplethysmographic (PPG) signals is one of the major obstacles in the extraction of reliable cardiovascular parameters in continuous monitoring applications. In the current paper we present an algorithm for motion artifact detection based on the analysis of the variations in the time and the period domain characteristics of the PPG signal. The extracted features are ranked using a normalized mutual information feature selection algorithm and the best features are used in a support vector machine classification model to distinguish between clean and corrupted sections of the PPG signal. The proposed method has been tested in healthy and cardiovascular diseased volunteers, considering 11 different motion artifact sources. The results achieved by the current algorithm (sensitivity--SE: 84.3%, specificity--SP: 91.5% and accuracy--ACC: 88.5%) show that the current methodology is able to identify both corrupted and clean PPG sections with high accuracy in both healthy (ACC: 87.5%) and cardiovascular diseases (ACC: 89.5%) context.

  6. Data cleaning in the energy domain

    NASA Astrophysics Data System (ADS)

    Akouemo Kengmo Kenfack, Hermine N.

    This dissertation addresses the problem of data cleaning in the energy domain, especially for natural gas and electric time series. The detection and imputation of anomalies improves the performance of the forecasting models that utilities need to lower purchasing and storage costs and to plan for peak energy loads or distribution shortages. There are various types of anomalies, each induced by diverse causes and sources depending on the field of study. The definition of false positives also depends on the context. The analysis focuses on energy data because of the availability of data and information to make a theoretical and practical contribution to the field. A probabilistic approach based on hypothesis testing is developed to decide whether a data point is anomalous based on the level of significance. Furthermore, the probabilistic approach is combined with statistical regression models to handle time series data. Domain knowledge of energy data and a survey of the causes and sources of anomalies in energy are incorporated into the data cleaning algorithm to improve the accuracy of the results. The data cleaning method is evaluated on simulated data sets in which anomalies were artificially inserted and on natural gas and electric data sets. In the simulation study, the performance of the method is evaluated for both detection and imputation on all identified causes of anomalies in energy data. The testing on utilities' data evaluates the percentage improvement that data cleaning brings to forecasting accuracy. A cross-validation study of the results is also performed to demonstrate the performance of the data cleaning algorithm on smaller data sets and to calculate a confidence interval for the results. The data cleaning algorithm is able to successfully identify energy time series anomalies, and the replacement of those anomalies improves forecasting model accuracy. The process is automatic, which is important because many data cleaning processes require human input and become impractical for very large data sets. The techniques are also applicable to other fields such as econometrics and finance, but the exogenous factors of the time series data need to be well defined.
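
    The hypothesis-testing idea can be sketched generically: fit a model, test each residual for statistical significance, and impute flagged points from the model. The sketch below is ours; the three-sigma cutoff and the constant model in the test are illustrative assumptions, not the dissertation's actual regression models.

```python
# Generic sketch: flag a point as anomalous when its model residual is
# statistically significant, then impute it from the model prediction.
import statistics

def clean_series(series, model, alpha_z=3.0):
    """`model(i)` predicts point i; residuals beyond `alpha_z` standard
    deviations from the mean residual are treated as anomalies and
    replaced by the prediction. Returns (cleaned series, flagged indices)."""
    residuals = [x - model(i) for i, x in enumerate(series)]
    mu = statistics.mean(residuals)
    sigma = statistics.stdev(residuals)
    cleaned, flags = [], []
    for i, x in enumerate(series):
        z = (residuals[i] - mu) / sigma
        if abs(z) > alpha_z:
            cleaned.append(model(i))
            flags.append(i)
        else:
            cleaned.append(x)
    return cleaned, flags
```

In a forecasting pipeline, the cleaned series would then be fed to the forecasting model in place of the raw data.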

  7. ERAASR: an algorithm for removing electrical stimulation artifacts from multielectrode array recordings

    NASA Astrophysics Data System (ADS)

    O'Shea, Daniel J.; Shenoy, Krishna V.

    2018-04-01

    Objective. Electrical stimulation is a widely used and effective tool in systems neuroscience, neural prosthetics, and clinical neurostimulation. However, electrical artifacts evoked by stimulation prevent the detection of spiking activity on nearby recording electrodes, which obscures the neural population response evoked by stimulation. We sought to develop a method to clean artifact-corrupted electrode signals recorded on multielectrode arrays in order to recover the underlying neural spiking activity. Approach. We created an algorithm that performs estimation and removal of array artifacts via sequential principal components regression (ERAASR). This approach leverages the similar structure of artifact transients, but not spiking activity, across simultaneously recorded channels on the array, across pulses within a train, and across trials. The ERAASR algorithm requires no special hardware, imposes no requirements on the shape of the artifact or the multielectrode array geometry, and comprises sequential application of straightforward linear methods with intuitive parameters. The approach should be readily applicable to most datasets where stimulation does not saturate the recording amplifier. Main results. The effectiveness of the algorithm is demonstrated in macaque dorsal premotor cortex using acute linear multielectrode array recordings and single electrode stimulation. Large electrical artifacts appeared on all channels during stimulation. After application of ERAASR, the cleaned signals were quiescent on channels with no spontaneous spiking activity, whereas spontaneously active channels exhibited evoked spikes which closely resembled spontaneously occurring spiking waveforms. Significance. We hope that enabling simultaneous electrical stimulation and multielectrode array recording will help elucidate the causal links between neural activity and cognition and facilitate naturalistic sensory prostheses.

  8. An improved method for polarimetric image restoration in interferometry

    NASA Astrophysics Data System (ADS)

    Pratley, Luke; Johnston-Hollitt, Melanie

    2016-11-01

    Interferometric radio astronomy data require the effects of limited coverage in the Fourier plane to be accounted for via a deconvolution process. For the last 40 years this process, known as `cleaning', has been performed almost exclusively on all Stokes parameters individually as if they were independent scalar images. However, here we demonstrate that, for the case of linear polarization P, this approach fails to properly account for the complex vector nature of the emission, resulting in a process that depends on the axes under which the deconvolution is performed. We present an improved method, `Generalized Complex CLEAN', which properly accounts for the complex vector nature of polarized emission and is invariant under rotations of the deconvolution axes. We use two Australia Telescope Compact Array data sets to test standard and complex CLEAN versions of the Högbom and SDI (Steer-Dewdney-Ito) CLEAN algorithms. We show that in general the complex CLEAN version of each algorithm produces more accurate clean components with fewer spurious detections and, due to reduced iterations, lower computational cost than the current methods. In particular, we find that the complex SDI CLEAN produces the best results for diffuse polarized sources as compared with standard CLEAN algorithms and other complex CLEAN algorithms. Given the move to wide-field, high-resolution polarimetric imaging with future telescopes such as the Square Kilometre Array, we suggest that Generalized Complex CLEAN should be adopted as the deconvolution method for all future polarimetric surveys and in particular that the complex version of an SDI CLEAN should be used.

  9. The Research and Implementation of MUSER CLEAN Algorithm Based on OpenCL

    NASA Astrophysics Data System (ADS)

    Feng, Y.; Chen, K.; Deng, H.; Wang, F.; Mei, Y.; Wei, S. L.; Dai, W.; Yang, Q. P.; Liu, Y. B.; Wu, J. P.

    2017-03-01

    High-performance data processing on a single machine is increasingly important in astronomical software development. However, because machine configurations differ, traditional programming techniques such as multi-threading and CUDA (Compute Unified Device Architecture)+GPU (Graphic Processing Unit) have obvious limitations in portability across operating systems. This paper introduces the use of OpenCL (Open Computing Language) in the development of the MUSER (MingantU SpEctral Radioheliograph) data processing system. The Högbom CLEAN algorithm is re-implemented as a parallel CLEAN algorithm using Python and the PyOpenCL extension package. The experimental results show that the OpenCL-based CLEAN algorithm runs with roughly the same efficiency as the earlier CUDA-based CLEAN algorithm. More importantly, the system also achieves high performance in a CPU-only environment, which resolves the environmental dependence of CUDA+GPU. Overall, the work improves the adaptability of the system while maintaining the performance of MUSER image cleaning, and the use of OpenCL in MUSER demonstrates its viability for scientific data processing. Given OpenCL's high-performance computing capabilities in heterogeneous environments, it may well become a preferred technology for future high-performance astronomical software development.

  10. Navigation of autonomous vehicles for oil spill cleaning in dynamic and uncertain environments

    NASA Astrophysics Data System (ADS)

    Jin, Xin; Ray, Asok

    2014-04-01

    In the context of oil spill cleaning by autonomous vehicles in dynamic and uncertain environments, this paper presents a multi-resolution algorithm that seamlessly integrates the concepts of local navigation and global navigation based on the sensory information; the objective here is to enable adaptive decision making and online replanning of vehicle paths. The proposed algorithm provides complete coverage of the search area for clean-up of the oil spills and does not suffer from the problem of local minima commonly encountered in potential-field-based methods. The efficacy of the algorithm is tested on a high-fidelity Player/Stage simulator for oil spill cleaning in a harbour, where the underlying oil weathering process is modelled as 2D random-walk particle tracking. A preliminary version of this paper was presented by X. Jin and A. Ray as 'Coverage Control of Autonomous Vehicles for Oil Spill Cleaning in Dynamic and Uncertain Environments', Proceedings of the American Control Conference, Washington, DC, June 2013, pp. 2600-2605.

  11. A Force-Controllable Macro-Micro Manipulator and its Application to Medical Robotics

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Uecker, Darrin R.; Wang, Yulun

    1993-01-01

    This paper describes an 8-degrees-of-freedom macro-micro robot. This robot is capable of performing tasks that require accurate force control, such as polishing, finishing, grinding, deburring, and cleaning. The design of the macro-micro mechanism, the control algorithms, and the hardware/software implementation of the algorithms are described in this paper. Initial experimental results are reported.

  12. Wideband RELAX and wideband CLEAN for aeroacoustic imaging

    NASA Astrophysics Data System (ADS)

    Wang, Yanwei; Li, Jian; Stoica, Petre; Sheplak, Mark; Nishida, Toshikazu

    2004-02-01

    Microphone arrays can be used for acoustic source localization and characterization in wind tunnel testing. In this paper, the wideband RELAX (WB-RELAX) and the wideband CLEAN (WB-CLEAN) algorithms are presented for aeroacoustic imaging using an acoustic array. WB-RELAX is a parametric approach that can be used efficiently for point source imaging without the sidelobe problems suffered by the delay-and-sum beamforming approaches. WB-CLEAN does not have sidelobe problems either, but it behaves more like a nonparametric approach and can be used for both point source and distributed source imaging. Moreover, neither of the algorithms suffers from the severe performance degradations encountered by the adaptive beamforming methods when the number of snapshots is small and/or the sources are highly correlated or coherent with each other. A two-step optimization procedure is used to implement the WB-RELAX and WB-CLEAN algorithms efficiently. The performance of WB-RELAX and WB-CLEAN is demonstrated by applying them to measured data obtained at the NASA Langley Quiet Flow Facility using a small aperture directional array (SADA). Somewhat surprisingly, using these approaches, not only were the parameters of the dominant source accurately determined, but a highly correlated multipath of the dominant source was also discovered.

  14. RANdom SAmple Consensus (RANSAC) algorithm for material-informatics: application to photovoltaic solar cells.

    PubMed

    Kaspi, Omer; Yosipof, Abraham; Senderowitz, Hanoch

    2017-06-06

    An important aspect of chemoinformatics and material-informatics is the use of machine learning algorithms to build Quantitative Structure Activity Relationship (QSAR) models. The RANdom SAmple Consensus (RANSAC) algorithm is a predictive modeling tool widely used in the image processing field for cleaning datasets of noise. RANSAC can be used as a "one stop shop" algorithm for developing and validating QSAR models, performing outlier removal, descriptor selection, model development, and predictions for test set samples using an applicability domain. For "future" predictions (i.e., for samples not included in the original test set) RANSAC provides a statistical estimate of the probability of obtaining reliable predictions, i.e., predictions within a pre-defined number of standard deviations from the true values. In this work we describe the first application of RANSAC in material informatics, focusing on the analysis of solar cells. We demonstrate that for three datasets representing different metal oxide (MO) based solar cell libraries, RANSAC-derived models select descriptors previously shown to correlate with key photovoltaic properties and lead to good predictive statistics for these properties. These models were subsequently used to predict the properties of virtual solar cell libraries, highlighting interesting dependencies of PV properties on MO compositions.
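
    RANSAC's consensus loop is easy to state. The sketch below is the generic 2-D line-fitting form of the algorithm, included to illustrate the consensus idea the paper transfers to QSAR modeling; it is not the authors' material-informatics pipeline, and the tolerance and iteration count are illustrative parameters.

```python
# Classic RANSAC: repeatedly fit a minimal model to a random sample,
# count the points it explains (inliers), and keep the model with the
# largest consensus set.
import random

def ransac_line(points, n_iter=200, tol=0.5, seed=0):
    """Robustly fit y = a*x + b: sample 2 points, count inliers within
    `tol`, keep the model with the most inliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue                      # degenerate sample, skip
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points
                   if abs(y - (a * x + b)) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers
```

Points far from the dominant linear trend are simply never part of the winning consensus set, which is the outlier-removal behavior the abstract refers to.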

  15. Improvement of training set structure in fusion data cleaning using Time-Domain Global Similarity method

    NASA Astrophysics Data System (ADS)

    Liu, J.; Lan, T.; Qin, H.

    2017-10-01

    Traditional data cleaning identifies dirty data by classifying original data sequences, which is a class-imbalanced problem since the proportion of incorrect data is much smaller than the proportion of correct data for most diagnostic systems in Magnetic Confinement Fusion (MCF) devices. When machine learning algorithms are used to classify diagnostic data based on a class-imbalanced training set, most classifiers are biased towards the major class and show very poor classification rates on the minor class. By transforming the direct classification problem about original data sequences into a classification problem about the physical similarity between data sequences, the class-balancing effect of the Time-Domain Global Similarity (TDGS) method on training set structure is investigated in this paper. Meanwhile, the impact of the improved training set structure on the data cleaning performance of the TDGS method is demonstrated with an application example in the EAST POlarimetry-INTerferometry (POINT) system.

  16. Oxygen transfer rate estimation in oxidation ditches from clean water measurements.

    PubMed

    Abusam, A; Keesman, K J; Meinema, K; Van Straten, G

    2001-06-01

    Standard methods for the determination of oxygen transfer rate are based on assumptions that are not valid for oxidation ditches. This paper presents a realistic and simple new method for estimating the oxygen transfer rate in oxidation ditches from clean water measurements. The new method models the oxidation ditch as a loop of CSTRs, a model that can be easily incorporated within control algorithms, and assumes zero oxygen transfer rates (KLa) in the unaerated CSTRs. Application of a formal estimation procedure to real data revealed that the aeration constant (k = KLa·VA, where VA is the volume of the aerated CSTR) can be determined significantly more accurately than KLa and VA. Therefore, the new method estimates k instead of KLa. In application to real data, this method proved to be more accurate than the commonly used Dutch standard method (STORA, 1980).
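
    The clean-water measurement principle behind any such KLa estimate is the reaeration curve C(t) = Cs - (Cs - C0)·exp(-KLa·t), whose log-deficit form is linear in t. The sketch below fits that single-tank curve for illustration; it is a generic textbook estimator, not the paper's loop-of-CSTRs procedure.

```python
# Generic clean-water reaeration fit: -ln((Cs - C)/(Cs - C0)) = KLa * t,
# so KLa is the least-squares slope through the origin of the
# log-deficit curve versus time.
import math

def estimate_kla(times, conc, c_sat):
    """`times`: sample times; `conc`: dissolved-oxygen readings;
    `c_sat`: saturation concentration Cs. Returns the KLa estimate."""
    c0 = conc[0]
    y = [-math.log((c_sat - c) / (c_sat - c0)) for c in conc]
    num = sum(t * yi for t, yi in zip(times, y))
    den = sum(t * t for t in times)
    return num / den
```

On synthetic data generated from the model itself, the estimator recovers the true KLa exactly, which is a quick sanity check before applying it to measured reaeration curves.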

  17. GraDit: graph-based data repair algorithm for multiple data edits rule violations

    NASA Astrophysics Data System (ADS)

    Madjida, Wa Ode Zuhayeni; Nugraha, I Gusti Bagus Baskara

    2018-03-01

    Constraint-based data cleaning captures data violations of a set of rules called data quality rules. The rules consist of integrity constraints and data edits. Structurally they are similar: each rule contains a left-hand side and a right-hand side. Previous research proposed a data repair algorithm for integrity constraint violations that uses an undirected hypergraph to represent rule violations. Nevertheless, that algorithm cannot be applied to data edits because the rule characteristics differ. This study proposes GraDit, a repair algorithm for data edits rules. First, we use a bipartite directed hypergraph as the representation of the overall set of defined rules; this representation captures the interaction between violated rules and clean rules. In addition, we propose an undirected graph as the violation representation. Our experimental study showed that the algorithm with an undirected graph as the violation representation model gave better data quality than the algorithm with an undirected hypergraph as the representation model.

  18. Adapting sensory data for multiple robots performing spill cleanup

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Storjohann, K.; Saltzen, E.

    1990-09-01

    This paper describes a possible method of converting a single-performing-robot algorithm into a multiple-performing-robot algorithm without the need to modify previously written code. The algorithm to be converted involves spill detection and clean-up by the HERMIES-III mobile robot. In order to achieve the goal of multiple performing robots with this algorithm, two steps are taken. First, the task is formally divided into two sub-tasks, spill detection and spill clean-up, the former of which is allocated to the added performing robot, HERMIES-IIB. Second, an inverse perspective mapping is applied to the data acquired by the new performing robot (HERMIES-IIB), allowing the data to be processed by the previously written algorithm without re-writing the code. 6 refs., 4 figs.

  19. RM-CLEAN: RM spectra cleaner

    NASA Astrophysics Data System (ADS)

    Heald, George

    2017-08-01

    RM-CLEAN reads in dirty Q and U cubes, generates the rotation measure transfer function (RMTF) from the frequencies given in an ASCII file, and cleans the RM spectra following the algorithm given by Brentjens (2007). The output cubes contain the clean model components and the CLEANed RM spectra. The input cubes must be reordered with mode=312, and the output cubes will have the same ordering and thus must be reordered after being written to disk. RM-CLEAN runs as a MIRIAD (ascl:1106.007) task and a Python wrapper is included with the code.

  20. A laboratory demonstration of high-resolution hard X-ray and gamma-ray imaging using Fourier-transform techniques

    NASA Technical Reports Server (NTRS)

    Palmer, David; Prince, Thomas A.

    1987-01-01

    A laboratory imaging system has been developed to study the use of Fourier-transform techniques in high-resolution hard X-ray and gamma-ray imaging, with particular emphasis on possible applications to high-energy astronomy. Considerations for the design of a Fourier-transform imager and the instrumentation used in the laboratory studies are described. Several analysis methods for image reconstruction are discussed, including the CLEAN algorithm and maximum entropy methods. Images obtained using these methods are presented.

  1. The Ettention software package.

    PubMed

    Dahmen, Tim; Marsalek, Lukas; Marniok, Nico; Turoňová, Beata; Bogachev, Sviatoslav; Trampert, Patrick; Nickels, Stefan; Slusallek, Philipp

    2016-02-01

    We present a novel software package for the problem of "reconstruction from projections" in electron microscopy. The Ettention framework consists of a set of modular building-blocks for tomographic reconstruction algorithms. The well-known block iterative reconstruction method based on the Kaczmarz algorithm is implemented using these building-blocks, including adaptations specific to electron tomography. Ettention simultaneously features (1) a modular, object-oriented software design, (2) optimized access to high-performance computing (HPC) platforms such as graphic processing units (GPUs) or many-core architectures like Xeon Phi, and (3) accessibility to microscopy end-users via integration in the IMOD package and the eTomo user interface. We also provide developers with a clean and well-structured application programming interface (API) that allows for extending the software easily and thus makes it an ideal platform for algorithmic research while hiding most of the technical details of high-performance computing. Copyright © 2015 Elsevier B.V. All rights reserved.
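
    The Kaczmarz row-action step underlying the block-iterative method mentioned above is compact: cycle through the rows of Ax = b and project the current estimate onto each row's hyperplane. The plain-Python sketch below is our illustration of that iteration, not Ettention's GPU implementation.

```python
# Kaczmarz iteration for a consistent linear system Ax = b: for each
# row a_i, move x onto the hyperplane a_i . x = b_i.

def kaczmarz(A, b, sweeps=100):
    """`A`: list of rows, `b`: right-hand side. Returns the estimate x
    after the given number of full sweeps."""
    x = [0.0] * len(A[0])
    for _ in range(sweeps):
        for row, bi in zip(A, b):
            dot = sum(r * xi for r, xi in zip(row, x))
            norm2 = sum(r * r for r in row)
            step = (bi - dot) / norm2     # signed distance / row norm^2
            x = [xi + step * r for xi, r in zip(x, row)]
    return x
```

Block-iterative variants such as the one in Ettention apply the same projection idea to groups of rows (projection blocks) at once, which maps well onto parallel hardware.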

  2. Optimal groundwater remediation design of pump and treat systems via a simulation-optimization approach and firefly algorithm

    NASA Astrophysics Data System (ADS)

    Javad Kazemzadeh-Parsi, Mohammad; Daneshmand, Farhang; Ahmadfard, Mohammad Amin; Adamowski, Jan; Martel, Richard

    2015-01-01

    In the present study, an optimization approach based on the firefly algorithm (FA) is combined with a finite element simulation method (FEM) to determine the optimum design of pump and treat remediation systems. Three multi-objective functions in which pumping rate and clean-up time are design variables are considered and the proposed FA-FEM model is used to minimize operating costs, total pumping volumes and total pumping rates in three scenarios while meeting water quality requirements. The groundwater lift and contaminant concentration are also minimized through the optimization process. The obtained results show the applicability of the FA in conjunction with the FEM for the optimal design of groundwater remediation systems. The performance of the FA is also compared with the genetic algorithm (GA) and the FA is found to have a better convergence rate than the GA.

  3. The development of a line-scan imaging algorithm for the detection of fecal contamination on leafy greens

    NASA Astrophysics Data System (ADS)

    Yang, Chun-Chieh; Kim, Moon S.; Chuang, Yung-Kun; Lee, Hoyoung

    2013-05-01

    This paper reports the development of a multispectral algorithm, derived using a line-scan hyperspectral imaging system, to detect fecal contamination on leafy greens. Fresh bovine feces were applied to the surfaces of washed loose baby spinach leaves. A hyperspectral line-scan imaging system was used to acquire hyperspectral fluorescence images of the contaminated leaves. Hyperspectral image analysis resulted in the selection of the 666 nm and 688 nm wavebands for a multispectral algorithm to rapidly detect feces on leafy greens by use of the ratio of fluorescence intensities measured at those two wavebands (666 nm over 688 nm). The algorithm successfully distinguished most of the less diluted fecal spots (0.05 g feces/ml water and 0.025 g feces/ml water) and some of the more diluted spots (0.0125 g feces/ml water and 0.00625 g feces/ml water) from the clean spinach leaves. The results showed the potential of the multispectral algorithm with a line-scan imaging system for application to automated food processing lines for food safety inspection of leafy green vegetables.
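
    A two-waveband ratio test of this kind reduces to a per-pixel threshold. The toy sketch below assumes contaminated pixels show a higher 666/688 intensity ratio than clean tissue; the threshold value is illustrative and is not taken from the paper.

```python
# Toy two-band ratio detector: mark a pixel as contaminated when the
# 666 nm / 688 nm fluorescence ratio exceeds a threshold.

def detect_contamination(img_666, img_688, ratio_threshold=1.05):
    """`img_666`, `img_688`: 2-D lists of fluorescence intensities at the
    two wavebands. Returns a binary mask (1 = flagged pixel)."""
    mask = []
    for row_a, row_b in zip(img_666, img_688):
        mask.append([1 if b > 0 and a / b > ratio_threshold else 0
                     for a, b in zip(row_a, row_b)])
    return mask
```

In a line-scan system the same test would be applied to each incoming scan line, which is what makes the two-band approach fast enough for inline inspection.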

  4. Design of a Maximum Power Point Tracker with Simulation, Analysis, and Comparison of Algorithms

    DTIC Science & Technology

    2012-12-01

    CHAPTER 1: INTRODUCTION It is a warm summer day. You feel the sun warm your skin and rejuvenate your motivation. The sun generates more... renewable, there has been an upsurge of interest in clean and renewable energy. While more than one option is available to fill that void, the most...solar array. When this algorithm is functioning correctly, it is said to be an MPPT. 1.2 Motivation Clean and renewable energy has greatly increased

  5. Ultrasonic cleaning: Fundamental theory and application

    NASA Technical Reports Server (NTRS)

    Fuchs, F. John

    1995-01-01

    This presentation describes: the theory of ultrasonics, cavitation and implosion; the importance and application of ultrasonics in precision cleaning; explanations of ultrasonic cleaning equipment options and their application; process parameters for ultrasonic cleaning; and proper operation of ultrasonic cleaning equipment to achieve maximum results.

  6. Signal processing using sparse derivatives with applications to chromatograms and ECG

    NASA Astrophysics Data System (ADS)

    Ning, Xiaoran

    In this thesis, we investigate sparsity in the derivative domain. In particular, we focus on signals that possess up to Mth-order (M > 0) sparse derivatives. Effort is devoted to formulating suitable penalty functions and optimization problems that capture properties related to sparse derivatives, and to finding fast, computationally efficient solvers. The effectiveness of these algorithms is demonstrated in two real-world applications. In the first application, we provide an algorithm that jointly addresses chromatogram baseline correction and noise reduction. The series of chromatogram peaks is modeled as sparse with sparse derivatives, and the baseline is modeled as a low-pass signal. A convex optimization problem is formulated to encapsulate these non-parametric models. To account for the positivity of chromatogram peaks, an asymmetric penalty function is utilized alongside symmetric penalty functions. A robust, computationally efficient, iterative algorithm is developed that is guaranteed to converge to the unique optimal solution. The approach, termed Baseline Estimation And Denoising with Sparsity (BEADS), is evaluated and compared with two state-of-the-art methods using both simulated and real chromatogram data, with promising results. In the second application, a novel electrocardiography (ECG) enhancement algorithm is designed, also based on sparse derivatives. In real medical environments, ECG signals are often contaminated by various kinds of noise or artifacts, for example, morphological changes due to motion artifact and non-stationary noise due to muscular contraction (EMG). Some of these contaminations severely affect the usefulness of ECG signals, especially when computer-aided algorithms are utilized. By solving the proposed convex l1 optimization problem, artifacts are reduced by modeling the clean ECG signal as a sum of two signals whose second- and third-order derivatives (differences), respectively, are sparse. Finally, the algorithm is applied to a QRS detection system and validated using the MIT-BIH Arrhythmia database (109,452 annotations), resulting in a sensitivity of Se = 99.87% and a positive predictivity of +P = 99.88%.
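
    BEADS itself couples an asymmetric peak penalty with a low-pass baseline model; as a simpler illustration of the underlying sparse-derivative idea, the sketch below penalizes only first-order derivative sparsity (an l1 difference penalty) and solves the resulting convex problem by iteratively reweighted least squares with a tridiagonal (Thomas) solver. All parameter values are illustrative, and this is not the thesis algorithm.

```python
def solve_tridiag(lower, diag, upper, b):
    """Thomas algorithm for a tridiagonal system.
    lower[i] multiplies x[i-1]; upper[i] multiplies x[i+1]."""
    n = len(b)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = upper[0] / diag[0]
    dp[0] = b[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - lower[i] * cp[i - 1]
        cp[i] = upper[i] / m
        dp[i] = (b[i] - lower[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def denoise_sparse_deriv(y, lam=1.0, iters=30, eps=1e-6):
    """Approximately minimize 0.5*||x - y||^2 + lam*sum|x[i+1] - x[i]|
    by iteratively reweighted least squares (a majorize-minimize scheme):
    each pass solves (I + lam * D^T W D) x = y, a tridiagonal system."""
    n = len(y)
    x = list(y)
    for _ in range(iters):
        w = [1.0 / max(abs(x[i + 1] - x[i]), eps) for i in range(n - 1)]
        diag = [1.0 + lam * ((w[i - 1] if i > 0 else 0.0) +
                             (w[i] if i < n - 1 else 0.0)) for i in range(n)]
        upper = [-lam * w[i] if i < n - 1 else 0.0 for i in range(n)]
        lower = [-lam * w[i - 1] if i > 0 else 0.0 for i in range(n)]
        x = solve_tridiag(lower, diag, upper, list(y))
    return x

noisy = [0.0, 0.3, -0.3, 0.2, 5.0, 5.3, 4.7, 5.2]
smooth = denoise_sparse_deriv(noisy, lam=1.0)
```

The l1 penalty on differences drives the estimate toward piecewise-constant segments while the quadratic fidelity term keeps it close to the data; note that the difference operator preserves the signal mean.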

  7. Statistical Methods in Ai: Rare Event Learning Using Associative Rules and Higher-Order Statistics

    NASA Astrophysics Data System (ADS)

    Iyer, V.; Shetty, S.; Iyengar, S. S.

    2015-07-01

    Rare-event learning has not been actively researched until recently, owing to the unavailability of algorithms that can deal with big samples. This research addresses spatio-temporal streams from multi-resolution sensors to find actionable items from the perspective of real-time algorithms. The computing framework is independent of the number of input samples, the application domain, and whether streams are labelled or label-less. A sampling-overlap algorithm such as Brooks-Iyengar is used for dealing with noisy sensor streams. We extend existing noise pre-processing algorithms using data-cleaning trees. Pre-processing with an ensemble of trees using bagging and multi-target regression showed robustness to random noise and missing data. As spatio-temporal streams are highly statistically correlated, we prove that temporal-window-based sampling from sensor data streams converges after n samples using Hoeffding bounds, which can be used for fast prediction of new samples in real time. The data-cleaning tree model uses a nonparametric node-splitting technique, which can be learned iteratively and scales linearly in memory consumption for any size of input stream. The improved task-based ensemble extraction is compared with non-linear computation models using various SVM kernels for speed and accuracy. We show on empirical datasets that the explicit rule-learning computation is linear in time and depends only on the number of leaves present in the tree ensemble. The use of unpruned trees (t) in our proposed ensemble always yields a minimum number (m) of leaves, keeping pre-processing computation to n × t log m, compared with N² for the Gram matrix. We also show that task-based feature induction yields higher Quality of Data (QoD) in the feature space compared with kernel methods using the Gram matrix.
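
    The Hoeffding-bound convergence argument can be made concrete: for i.i.d. samples bounded in [0, 1], Hoeffding's inequality gives the number of samples n needed before a window mean is within ε of the true mean with probability at least 1 − δ. The window/stream specifics below are our illustrative assumptions, not the paper's setup:

```python
import math

def hoeffding_samples(eps, delta):
    """Smallest n with P(|sample mean - true mean| >= eps) <= delta,
    i.e. n >= ln(2/delta) / (2*eps**2), for samples bounded in [0, 1]."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

# e.g. a temporal window accurate to within 0.05 with 95% confidence:
n = hoeffding_samples(0.05, 0.05)  # 738 samples suffice
```

Because the bound is distribution-free, the required window size depends only on ε and δ, not on the stream length — which is what makes the framework independent of the number of input samples.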

  8. Parallel Algorithm for GPU Processing; for use in High Speed Machine Vision Sensing of Cotton Lint Trash.

    PubMed

    Pelletier, Mathew G

    2008-02-08

    One of the main hurdles standing in the way of optimal cleaning of cotton lint is the lack of sensing systems that can react fast enough to provide the control system with real-time information as to the level of trash contamination of the cotton lint. This research examines the use of programmable graphics processing units (GPU) as an alternative to the PC's traditional use of the central processing unit (CPU). The use of the GPU, as an alternative computation platform, allowed the machine vision system to gain a significant improvement in processing time. By improving the processing time, this research seeks to address the lack of availability of rapid trash sensing systems and thus alleviate a situation in which the current systems view the cotton lint either well before, or after, the cotton is cleaned. This extended lag/lead time that is currently imposed on the cotton trash cleaning control systems is what is responsible for system operators utilizing a very large dead-band safety buffer in order to ensure that the cotton lint is not under-cleaned. Unfortunately, the utilization of a large dead-band buffer results in the majority of the cotton lint being over-cleaned, which in turn causes lint fiber damage as well as significant losses of the valuable lint due to the excessive use of cleaning machinery. This research estimates that upwards of a 30% reduction in lint loss could be gained through the use of a trash sensor tightly coupled to the cleaning machinery control systems. This research seeks to improve processing times through the development of a new algorithm for cotton trash sensing that allows for implementation on a highly parallel architecture. Additionally, by moving the new parallel algorithm onto an alternative computing platform, the graphics processing unit (GPU), for processing of the cotton trash images, a speed-up of over 6.5 times over optimized code running on the PC's central processing unit (CPU) was gained. The new parallel algorithm operating on the GPU was able to process a 1024x1024 image in less than 17 ms. At this improved speed, the image processing system's performance should now be sufficient to provide a system capable of real-time feedback control in tight cooperation with the cleaning equipment.

  9. Advanced CHP Control Algorithms: Scope Specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katipamula, Srinivas; Brambley, Michael R.

    2006-04-28

    The primary objective of this multiyear project is to develop algorithms for combined heat and power systems to ensure optimal performance, increase reliability, and lead to the goal of clean, efficient, reliable and affordable next generation energy systems.

  10. Deconvolution enhanced direction of arrival estimation using one- and three-component seismic arrays applied to ocean induced microseisms

    NASA Astrophysics Data System (ADS)

    Gal, M.; Reading, A. M.; Ellingsen, S. P.; Koper, K. D.; Burlacu, R.; Gibbons, S. J.

    2016-07-01

    Microseisms in the period range of 2-10 s are generated in deep oceans and near coastal regions. It is common for microseisms from multiple sources to arrive at the same time at a given seismometer, so it is desirable to be able to measure multiple slowness vectors accurately. Popular ways to estimate the direction of arrival of ocean-induced microseisms are the conventional (fk) and adaptive (Capon) beamformers. These techniques give robust estimates, but are limited in their resolution capabilities and hence do not always detect all arrivals. One of the limiting factors in determining direction of arrival with seismic arrays is the array response, which can strongly influence the estimation of weaker sources. In this work, we aim to improve the resolution for weaker sources and evaluate the performance of two deconvolution algorithms, Richardson-Lucy deconvolution and a new implementation of CLEAN-PSF. The algorithms are tested with three arrays of different aperture (ASAR, WRA and NORSAR) using 1 month of real data each and compared with the conventional approaches. We find an improvement over conventional methods from both algorithms and the best performance with CLEAN-PSF. We then extend the CLEAN-PSF framework to three components (3C) and evaluate 1 yr of data from the Pilbara Seismic Array in northwest Australia. The 3C CLEAN-PSF analysis is capable of resolving a previously undetected Sn phase.
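
    CLEAN-PSF operates on array power spectra, but the classic CLEAN iteration it builds on is easy to sketch in one dimension: repeatedly locate the strongest peak of the "dirty" response, record a fraction of it as a source component, and subtract the correspondingly scaled point-spread function. This is a generic Högbom-style sketch with made-up numbers, not the authors' three-component implementation:

```python
def clean_1d(dirty, psf, gain=0.1, max_iter=500, threshold=1e-3):
    """Generic 1-D CLEAN: deconvolve a known, centered PSF (odd length,
    peak normalized to 1) out of a 'dirty' response."""
    residual = list(dirty)
    components = [0.0] * len(dirty)
    half = len(psf) // 2
    for _ in range(max_iter):
        peak = max(range(len(residual)), key=lambda i: abs(residual[i]))
        amp = residual[peak]
        if abs(amp) < threshold:
            break  # residual is down at the noise floor
        components[peak] += gain * amp          # accumulate a clean component
        for j, p in enumerate(psf):             # subtract the scaled PSF
            k = peak + j - half
            if 0 <= k < len(residual):
                residual[k] -= gain * amp * p
    return components, residual

psf = [0.5, 1.0, 0.5]                           # toy point-spread function
dirty = [0, 0, 0, 0, 1.0, 2.0, 1.0, 0, 0, 0]    # one source, amplitude 2, index 5
components, residual = clean_1d(dirty, psf)
```

The small loop gain trades speed for stability: each pass removes only 10% of the peak, which keeps sidelobes of one source from being mistaken for another — the property that lets CLEAN-style methods resolve weak arrivals hidden under the array response.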

  11. 40 CFR 420.110 - Applicability; description of the alkaline cleaning subcategory.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... alkaline cleaning subcategory. 420.110 Section 420.110 Protection of Environment ENVIRONMENTAL PROTECTION... Alkaline Cleaning Subcategory § 420.110 Applicability; description of the alkaline cleaning subcategory... alkaline cleaning baths to remove mineral and animal fats or oils from the steel, and those rinsing...

  12. 40 CFR 420.110 - Applicability; description of the alkaline cleaning subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... alkaline cleaning subcategory. 420.110 Section 420.110 Protection of Environment ENVIRONMENTAL PROTECTION... Alkaline Cleaning Subcategory § 420.110 Applicability; description of the alkaline cleaning subcategory... alkaline cleaning baths to remove mineral and animal fats or oils from the steel, and those rinsing...

  13. 40 CFR 420.110 - Applicability; description of the alkaline cleaning subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... alkaline cleaning subcategory. 420.110 Section 420.110 Protection of Environment ENVIRONMENTAL PROTECTION... Alkaline Cleaning Subcategory § 420.110 Applicability; description of the alkaline cleaning subcategory... alkaline cleaning baths to remove mineral and animal fats or oils from the steel, and those rinsing...

  14. 40 CFR 420.110 - Applicability; description of the alkaline cleaning subcategory.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... alkaline cleaning subcategory. 420.110 Section 420.110 Protection of Environment ENVIRONMENTAL PROTECTION... Alkaline Cleaning Subcategory § 420.110 Applicability; description of the alkaline cleaning subcategory... alkaline cleaning baths to remove mineral and animal fats or oils from the steel, and those rinsing...

  15. 40 CFR 420.110 - Applicability; description of the alkaline cleaning subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... alkaline cleaning subcategory. 420.110 Section 420.110 Protection of Environment ENVIRONMENTAL PROTECTION... Alkaline Cleaning Subcategory § 420.110 Applicability; description of the alkaline cleaning subcategory... alkaline cleaning baths to remove mineral and animal fats or oils from the steel, and those rinsing...

  16. Bio-Inspired Self-Cleaning Surfaces

    NASA Astrophysics Data System (ADS)

    Liu, Kesong; Jiang, Lei

    2012-08-01

    Self-cleaning surfaces have drawn a lot of interest for both fundamental research and practical applications. This review focuses on the recent progress in mechanism, preparation, and application of self-cleaning surfaces. To date, self-cleaning has been demonstrated by the following four conceptual approaches: (a) TiO2-based superhydrophilic self-cleaning, (b) lotus effect self-cleaning (superhydrophobicity with a small sliding angle), (c) gecko setae-inspired self-cleaning, and (d) underwater organisms-inspired antifouling self-cleaning. Although a number of self-cleaning products have been commercialized, the remaining challenges and future outlook of self-cleaning surfaces are also briefly addressed. Through evolution, nature, which has long been a source of inspiration for scientists and engineers, has arrived at what is optimal. We hope this review will stimulate interdisciplinary collaboration among material science, chemistry, biology, physics, nanoscience, engineering, etc., which is essential for the rational design and reproducible construction of bio-inspired multifunctional self-cleaning surfaces in practical applications.

  17. Identification of noise artifacts in searches for long-duration gravitational-wave transients

    NASA Astrophysics Data System (ADS)

    Prestegard, Tanner; Thrane, Eric; Christensen, Nelson L.; Coughlin, Michael W.; Hubbert, Ben; Kandhasamy, Shivaraj; MacAyeal, Evan; Mandic, Vuk

    2012-05-01

    We present an algorithm for the identification of transient noise artifacts (glitches) in cross-correlation searches for long gravitational-wave (GW) transients lasting seconds to weeks. The algorithm utilizes the auto-power in each detector as a discriminator between well-behaved stationary noise (possibly including a GW signal) and non-stationary noise transients. We test the algorithm with both Monte Carlo noise and time-shifted data from the LIGO S5 science run and find that it removes a significant fraction of glitches while keeping the vast majority (99.6%) of the data. We show that this cleaned data can be used to observe GW signals at a significantly lower amplitude than can otherwise be achieved. Using an accretion disk instability signal model, we estimate that the algorithm is accidentally triggered at a rate of less than 10^-5% by realistic signals, and less than 3% even for exceptionally loud signals. We conclude that the algorithm is a safe and effective method for cleaning the cross-correlation data used in searches for long GW transients.
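
    The auto-power discriminator can be illustrated on a single time series: compute the mean-square power of each analysis segment and flag segments whose power is far above the typical (median) level. The segment length and veto factor below are illustrative assumptions, and the real pipeline operates on cross-correlated two-detector data:

```python
def flag_glitches(series, seg_len, veto_factor=5.0):
    """Return one boolean per segment: True if the segment's auto-power
    (mean square) exceeds veto_factor times the median segment power."""
    segs = [series[i:i + seg_len]
            for i in range(0, len(series) - seg_len + 1, seg_len)]
    powers = [sum(v * v for v in seg) / len(seg) for seg in segs]
    median = sorted(powers)[len(powers) // 2]
    return [p > veto_factor * median for p in powers]

# Stationary toy "noise" with one loud transient in the fourth segment.
series = [1.0, -1.0] * 16
for i in range(12, 16):
    series[i] *= 10.0
flags = flag_glitches(series, seg_len=4)
```

Using the median rather than the mean as the reference keeps a single loud glitch from inflating the baseline power estimate and hiding itself.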

  18. Abort Gap Cleaning for LHC Run 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uythoven, Jan; Boccardi, Andrea; Bravin, Enrico

    2014-07-01

    To minimize the beam losses at the moment of an LHC beam dump the 3 μs long abort gap should contain as few particles as possible. Its population can be minimised by abort gap cleaning using the LHC transverse damper system. The LHC Run 1 experience is briefly recalled; changes foreseen for the LHC Run 2 are presented. They include improvements in the observation of the abort gap population and the mechanism to decide if cleaning is required, changes to the hardware of the transverse dampers to reduce the detrimental effect on the luminosity lifetime, and proposed changes to the applied cleaning algorithms.

  19. Design and control of a macro-micro robot for precise force applications

    NASA Technical Reports Server (NTRS)

    Wang, Yulun; Mangaser, Amante; Laby, Keith; Jordan, Steve; Wilson, Jeff

    1993-01-01

    Creating a robot which can delicately interact with its environment has been the goal of much research. Primarily two difficulties have made this goal hard to attain. The execution of control strategies which enable precise force manipulations are difficult to implement in real time because such algorithms have been too computationally complex for available controllers. Also, a robot mechanism which can quickly and precisely execute a force command is difficult to design. Actuation joints must be sufficiently stiff, frictionless, and lightweight so that desired torques can be accurately applied. This paper describes a robotic system which is capable of delicate manipulations. A modular high-performance multiprocessor control system was designed to provide sufficient compute power for executing advanced control methods. An 8 degree of freedom macro-micro mechanism was constructed to enable accurate tip forces. Control algorithms based on the impedance control method were derived, coded, and load balanced for maximum execution speed on the multiprocessor system. Delicate force tasks such as polishing, finishing, cleaning, and deburring, are the target applications of the robot.
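
    Impedance control, the method named above, commands a tip force as if a virtual spring-damper connected the tip to its desired trajectory. A minimal single-axis sketch with hypothetical gains (the paper's 8-degree-of-freedom, multiprocessor formulation is far richer):

```python
def impedance_force(x, v, x_des, v_des, k=200.0, b=20.0):
    """Single-axis impedance law: command the force of a virtual spring
    (stiffness k, N/m) and damper (damping b, N·s/m) tying the tip at
    position x, velocity v to the desired state (x_des, v_des).
    Gains are illustrative assumptions."""
    return k * (x_des - x) + b * (v_des - v)

# At the desired state the commanded force vanishes; a 1 cm position
# error under these hypothetical gains commands a 2 N restoring force.
f_rest = impedance_force(0.0, 0.0, 0.0, 0.0)
f_err = impedance_force(0.0, 0.0, 0.01, 0.0)
```

Tuning k and b sets how "soft" the tip feels on contact, which is why this family of controllers suits delicate tasks like polishing and deburring.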

  20. A force-controllable macro-micro manipulator and its application to medical robots

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Uecker, Darrin R.; Wang, Yulun

    1994-01-01

    This paper describes an 8-degrees-of-freedom macro-micro robot. This robot is capable of performing tasks that require accurate force control, such as polishing, finishing, grinding, deburring, and cleaning. The design of the macro-micro mechanism, the control algorithms, and the hardware/software implementation of the algorithms are described in this paper. Initial experimental results are reported. In addition, this paper includes a discussion of medical surgery and the role that force control may play. We introduce a new class of robotic systems collectively called Robotic Enhancement Technology (RET). RET systems introduce the combination of robotic manipulation with human control to perform manipulation tasks beyond the individual capability of either human or machine. The RET class of robotic systems offers new challenges in mechanism design, control-law development, and man/machine interface design. We believe force-controllable mechanisms such as the macro-micro structure we have developed are a necessary part of RET. Work in progress in the area of RET systems and their application to minimally invasive surgery is presented, along with future research directions.

  1. 40 CFR 63.5734 - What standards must I meet for resin and gel coat application equipment cleaning operations?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...? (a) For routine flushing of resin and gel coat application equipment (e.g., spray guns, flowcoaters... and gel coat application equipment cleaning operations? 63.5734 Section 63.5734 Protection of... Pollutants for Boat Manufacturing Standards for Resin and Gel Coat Application Equipment Cleaning Operations...

  2. Applying a Wearable Voice-Activated Computer to Instructional Applications in Clean Room Environments

    NASA Technical Reports Server (NTRS)

    Graves, Corey A.; Lupisella, Mark L.

    2004-01-01

    The use of wearable computing technology in restrictive environments related to space applications offers promise in a number of domains. The clean room environment is one such domain in which hands-free, heads-up, wearable computing is particularly attractive for education and training because of the nature of clean room work. We have developed and tested a Wearable Voice-Activated Computing (WEVAC) system based on clean room applications. Results of this initial proof-of-concept work indicate that there is a strong potential for WEVAC to enhance clean room activities.

  3. Noise Reduction with Microphone Arrays for Speaker Identification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, Z

    Reducing acoustic noise in audio recordings is an ongoing problem that plagues many applications. This noise is hard to reduce because of interfering sources and the non-stationary behavior of the overall background noise. Many single-channel noise reduction algorithms exist but are limited in that the more the noise is reduced, the more the signal of interest is distorted, because the signal and noise overlap in frequency. Acoustic background noise causes particular problems in the area of speaker identification: recording a speaker in the presence of acoustic noise ultimately limits the performance and confidence of speaker identification algorithms. In situations where it is impossible to control the environment where the speech sample is taken, noise reduction filtering algorithms need to be developed to clean the recorded speech of background noise. Because single-channel noise reduction algorithms would distort the speech signal, the overall challenge of this project was to see if spatial information provided by microphone arrays could be exploited to aid in speaker identification. The goals are: (1) test the feasibility of using microphone arrays to reduce background noise in speech recordings; (2) characterize and compare different multichannel noise reduction algorithms; (3) provide recommendations for using these multichannel algorithms; and (4) ultimately answer the question — can the use of microphone arrays aid in speaker identification?
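
    The simplest way spatial information from a microphone array helps is delay-and-sum beamforming: advance each channel by its propagation delay toward the talker and average, so the speech adds coherently while uncorrelated noise partially cancels. A toy sketch with known integer sample delays (real arrays need delay estimation and fractional-delay filtering); this is a generic multichannel technique, not necessarily one of the algorithms the report compares:

```python
def delay_and_sum(channels, delays):
    """Average the channels after removing each channel's known
    integer delay (in samples) toward the source."""
    n = min(len(ch) - d for ch, d in zip(channels, delays))
    return [sum(ch[d + i] for ch, d in zip(channels, delays)) / len(channels)
            for i in range(n)]

speech = [0.0, 1.0, 0.0, -1.0] * 3      # toy clean waveform
mics = [speech,                          # closest microphone: no delay
        [0.0] + speech,                  # arrives one sample later
        [0.0, 0.0] + speech]             # arrives two samples later
out = delay_and_sum(mics, [0, 1, 2])     # recovers the aligned waveform
```

With M microphones, coherent speech power is preserved while uncorrelated noise power drops by roughly a factor of M — a gain obtained without the spectral distortion of single-channel filtering.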

  4. Supersonic Gas-Liquid Cleaning System

    NASA Technical Reports Server (NTRS)

    Kinney, Frank

    1996-01-01

    The Supersonic Gas-Liquid Cleaning System Research Project consisted mainly of a feasibility study, including theoretical and engineering analysis, of a proof-of-concept prototype of this particular cleaning system developed by NASA-KSC. The cleaning system utilizes gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the device to be cleaned. The cleaning fluid being accelerated to these high velocities may consist of any solvent or liquid, including water. Compressed air or any inert gas is used to provide the conveying medium for the liquid, as well as substantially reduce the total amount of liquid needed to perform adequate surface cleaning and cleanliness verification. This type of aqueous cleaning system is considered to be an excellent way of conducting cleaning and cleanliness verification operations as replacements for the use of CFC 113 which must be discontinued by 1995. To utilize this particular cleaning system in various cleaning applications for both the Space Program and the commercial market, it is essential that the cleaning system, especially the supersonic nozzle, be characterized for such applications. This characterization consisted of performing theoretical and engineering analysis, identifying desirable modifications/extensions to the basic concept, evaluating effects of variations in operating parameters, and optimizing hardware design for specific applications.

  5. A Method for Semi-quantitative Assessment of Exposure to Pesticides of Applicators and Re-entry Workers: An Application in Three Farming Systems in Ethiopia.

    PubMed

    Negatu, Beyene; Vermeulen, Roel; Mekonnen, Yalemtshay; Kromhout, Hans

    2016-07-01

    To develop an inexpensive and easily adaptable semi-quantitative exposure assessment method to characterize exposure to pesticides among applicators and re-entry farmers and farm workers in Ethiopia. Two specific semi-quantitative exposure algorithms, for pesticide applicators and for re-entry workers, were developed and applied to 601 farm workers employed in 3 distinctly different farming systems [small-scale irrigated, large-scale greenhouses (LSGH), and large-scale open (LSO)] in Ethiopia. The algorithm for applicators was based on exposure-modifying factors including application method, farm layout (open or closed), pesticide mixing conditions, cleaning of spraying equipment, intensity of pesticide application per day, utilization of personal protective equipment (PPE), personal hygienic behavior, annual frequency of application, and duration of employment at the farm. The algorithm for re-entry work was based on an expert-based re-entry exposure intensity score, utilization of PPE, personal hygienic behavior, annual frequency of re-entry work, and duration of employment at the farm. The algorithms allowed estimation of daily, annual, and cumulative lifetime exposure for applicators and re-entry workers by farming system, gender, and age group. For all metrics, the highest exposures occurred in LSGH for both applicators and female re-entry workers. For male re-entry workers, the highest cumulative exposure occurred in LSO farms. Female re-entry workers appeared to be more highly exposed on a daily or annual basis than male re-entry workers, but their cumulative exposures were similar because males on average had longer tenure. Factors related to intensity of exposure (such as application method and farm layout) were the main driving factors for estimated potential exposure. Use of personal protection, hygienic behavior, and duration of employment of the surveyed farm workers contributed less to the contrast in exposure estimates.
This study indicated that farmers' and farm workers' exposure to pesticides can be inexpensively characterized, ranked, and classified. Our method could be extended to assess exposure to specific active ingredients provided that detailed information on pesticides used is available. The resulting exposure estimates will consequently be used in occupational epidemiology studies in Ethiopia and other similar countries with few resources. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
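
    The applicator algorithm combines exposure-modifying factors into a semi-quantitative score. A hypothetical sketch of how such factors might be combined — the factor names, levels, and weights below are our illustrative assumptions, not the published scoring:

```python
# All category weights here are illustrative assumptions, not values
# from the study.
INTENSITY = {"backpack_spray": 3.0, "boom_spray": 2.0, "drip": 1.0}
LAYOUT = {"closed": 2.0, "open": 1.0}            # closed layouts trap drift
PPE = {"full": 0.25, "partial": 0.5, "none": 1.0}
HYGIENE = {"good": 0.5, "poor": 1.0}

def daily_score(method, layout, ppe, hygiene):
    """Semi-quantitative daily exposure intensity for an applicator."""
    return INTENSITY[method] * LAYOUT[layout] * PPE[ppe] * HYGIENE[hygiene]

def cumulative_score(daily, days_per_year, years):
    """Lifetime cumulative exposure: daily score x annual frequency x tenure."""
    return daily * days_per_year * years

worst = daily_score("backpack_spray", "closed", "none", "poor")
best = daily_score("drip", "open", "full", "good")
```

Multiplying (rather than adding) the factors mirrors how intensity-type factors dominate the contrast between workers, while protective factors act as fractional reducers.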

  6. Constraint-Based Local Search for Constrained Optimum Paths Problems

    NASA Astrophysics Data System (ADS)

    Pham, Quang Dung; Deville, Yves; van Hentenryck, Pascal

    Constrained Optimum Path (COP) problems arise in many real-life applications and are ubiquitous in communication networks. They have been traditionally approached by dedicated algorithms, which are often hard to extend with side constraints and to apply widely. This paper proposes a constraint-based local search (CBLS) framework for COP applications, bringing the compositionality, reuse, and extensibility at the core of CBLS and CP systems. The modeling contribution is the ability to express compositional models for various COP applications at a high level of abstraction, while cleanly separating the model and the search procedure. The main technical contribution is a connected neighborhood based on rooted spanning trees to find high-quality solutions to COP problems. The framework, implemented in COMET, is applied to Resource Constrained Shortest Path (RCSP) problems (with and without side constraints) and to the edge-disjoint paths problem (EDP). Computational results show the potential significance of the approach.

  7. DMSP SSJ4 Data Restoration, Classification, and On-Line Data Access

    NASA Technical Reports Server (NTRS)

    Wing, Simon; Bredekamp, Joseph H. (Technical Monitor)

    2000-01-01

    Compress and clean raw data files for permanent storage: we have identified various error conditions/types and developed algorithms to remove these errors/noise, including the more complicated noise in the newer data sets (status = 100% complete). Internet access to the compacted raw data: it is now possible to access the raw data via our web site, http://www.jhuapl.edu/Aurora/index.html, and the software to read and plot the compacted raw data is available from the same web site. Users can download the raw data and read, plot, or manipulate the data as they wish on their own computers; they are also able to access the cleaned data sets. Internet access to the color spectrograms: this task has also been completed, and the spectrograms can be accessed from the web site mentioned above. Improve the particle precipitation region classification: the algorithm for this task has been developed and implemented, and as a result the accuracies improved; the web site now routinely distributes the results of applying the new algorithm to the cleaned data set. Mark the classification regions on the spectrograms: the software to mark the classification regions in the spectrograms has been completed and is also available from our web site.

  8. Correlating objective and subjective evaluation of texture appearance with applications to camera phone imaging

    NASA Astrophysics Data System (ADS)

    Phillips, Jonathan B.; Coppola, Stephen M.; Jin, Elaine W.; Chen, Ying; Clark, James H.; Mauer, Timothy A.

    2009-01-01

    Texture appearance is an important component of photographic image quality as well as object recognition. Noise cleaning algorithms are used to decrease sensor noise of digital images, but can hinder texture elements in the process. The Camera Phone Image Quality (CPIQ) initiative of the International Imaging Industry Association (I3A) is developing metrics to quantify texture appearance. Objective and subjective experimental results of the texture metric development are presented in this paper. Eight levels of noise cleaning were applied to ten photographic scenes that included texture elements such as faces, landscapes, architecture, and foliage. Four companies (Aptina Imaging, LLC, Hewlett-Packard, Eastman Kodak Company, and Vista Point Technologies) have performed psychophysical evaluations of overall image quality using one of two methods of evaluation. Both methods presented paired comparisons of images on thin film transistor liquid crystal displays (TFT-LCD), but the display pixel pitch and viewing distance differed. CPIQ has also been developing objective texture metrics and targets that were used to analyze the same eight levels of noise cleaning. The correlation of the subjective and objective test results indicates that texture perception can be modeled with an objective metric. The two methods of psychophysical evaluation exhibited high correlation despite the differences in methodology.

  9. SVM classification of microaneurysms with imbalanced dataset based on borderline-SMOTE and data cleaning techniques

    NASA Astrophysics Data System (ADS)

    Wang, Qingjie; Xin, Jingmin; Wu, Jiayi; Zheng, Nanning

    2017-03-01

    Microaneurysms are the earliest clinical signs of diabetic retinopathy, and many algorithms have been developed for the automatic classification of this specific pathology. However, the imbalanced class distribution of the dataset usually causes the classification accuracy for true microaneurysms to be low. Therefore, by combining the borderline synthetic minority over-sampling technique (BSMOTE) with data cleaning techniques such as Tomek links and Wilson's edited nearest neighbor rule (ENN) to resample the imbalanced dataset, we propose two new support vector machine (SVM) classification algorithms for microaneurysms. The proposed BSMOTE-Tomek and BSMOTE-ENN algorithms consist of: 1) adaptive synthesis of minority samples in the neighborhood of the borderline, and 2) removal of redundant training samples to improve the efficiency of data utilization. Moreover, a modified SVM classifier with probabilistic outputs is used to divide the microaneurysm candidates into two groups: true microaneurysms and false microaneurysms. Experiments with a public microaneurysm database show that the proposed algorithms have better classification performance in terms of the receiver operating characteristic (ROC) curve and the free-response receiver operating characteristic (FROC) curve.
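
    The core oversampling step shared by SMOTE and borderline-SMOTE is linear interpolation between a minority sample and one of its same-class nearest neighbors (borderline-SMOTE simply restricts this step to minority samples near the class boundary). A minimal sketch of that interpolation step only:

```python
import random

def smote_sample(x, neighbor, rng):
    """Synthesize a new minority-class sample at a random point on the
    segment between x and a same-class neighbor (the core SMOTE step)."""
    r = rng.random()  # uniform in [0, 1)
    return [a + r * (b - a) for a, b in zip(x, neighbor)]

rng = random.Random(0)                 # seeded for reproducibility
new = smote_sample([0.0, 0.0], [1.0, 2.0], rng)
```

Every synthetic point lies on the segment between the two real minority samples, so the minority region is densified without duplicating existing points; the Tomek-link or ENN cleaning pass then removes samples that end up overlapping the majority class.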

  10. Research progress of nano self-cleaning anti-fouling coatings

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Zhao, Y. J.; Teng, J. L.; Wang, J. H.; Wu, L. S.; Zheng, Y. L.

    2018-01-01

    There are many methods for evaluating the performance of nano self-cleaning anti-fouling coatings, such as the carbon-blacking method, the coating reflection coefficient method, the glass microbead method, the film method, the contact angle and rolling angle method, and the organic degradation method, along with the application of these performance evaluation methods to self-cleaning antifouling coatings. Furthermore, the types of nano self-cleaning anti-fouling coatings based on aqueous media are described, such as photocatalytic self-cleaning coatings, silicone coatings, organic fluorine coatings, fluorosilicone coatings, fluorocarbon coatings, and polysilazane self-cleaning coatings. The research and application of the different kinds of nano self-cleaning antifouling coatings are analyzed, and the latest research results are summarized.

  11. Results Of Automating A Photolithography Cell In A Clean Tunnel

    NASA Astrophysics Data System (ADS)

    June, David H.

    1987-01-01

    A prototype automated photobay was installed in an existing fab area utilizing flexible material handling techniques within a clean tunnel. The project objective was to prove design concepts of automated cassette-to-cassette handling within a clean tunnel that isolated operators from the wafers being processed. Material handling was performed by a monorail track transport system that fed cassettes to pick-and-place robots. The robots loaded and unloaded cassettes of wafers at each of the various pieces of process equipment. The material handling algorithms, recipe downloading, and statistical process control functions were all performed by custom software on the photobay cell controller.

  12. Recognition of plant parts with problem-specific algorithms

    NASA Astrophysics Data System (ADS)

    Schwanke, Joerg; Brendel, Thorsten; Jensch, Peter F.; Megnet, Roland

    1994-06-01

    Automatic micropropagation is necessary to produce large amounts of biomass cost-effectively. Juvenile plants are dissected in a clean-room environment at particular points on the stem or the leaves. A vision system detects possible cutting points and controls a specialized robot. This contribution is directed at the pattern-recognition algorithms used to detect structural parts of the plant.

  13. Examining factors that influence the effectiveness of cleaning antineoplastic drugs from drug preparation surfaces: a pilot study.

    PubMed

    Hon, Chun-Yip; Chua, Prescillia Ps; Danyluk, Quinn; Astrakianakis, George

    2014-06-01

    Occupational exposure to antineoplastic drugs has been documented to result in various adverse health effects. Despite the implementation of control measures to minimize exposure, detectable levels of drug residue are still found on hospital work surfaces. Cleaning these surfaces is considered one means of minimizing the exposure potential. However, there are no consistent guiding principles for cleaning contaminated surfaces, leading hospitals to adopt varying practices. As such, this pilot study sought to evaluate current cleaning protocols and identify those factors that were most effective in reducing contamination on drug preparation surfaces. Three cleaning variables were examined: (1) type of cleaning agent (CaviCide®, Phenokil II™, bleach and chlorhexidine), (2) application method of cleaning agent (directly onto the surface or indirectly onto a wipe) and (3) use of isopropyl alcohol after cleaning agent application. Known concentrations of antineoplastic drugs (either methotrexate or cyclophosphamide) were placed on a stainless steel swatch and then, systematically, each of the three cleaning variables was tested. Surface wipes were collected and quantified using high-performance liquid chromatography-tandem mass spectrometry to determine the percentage of drug removed (with 100% being complete elimination of the drug). No single cleaning agent proved effective in completely eliminating all drug contamination. The method of application had minimal effect on the amount of drug residual. In general, application of isopropyl alcohol after the use of a cleaning agent further reduced the level of drug contamination, although measurable levels of drug were still found in some cases.

  14. A CLEAN-based method for mosaic deconvolution

    NASA Astrophysics Data System (ADS)

    Gueth, F.; Guilloteau, S.; Viallefond, F.

    1995-03-01

    Mosaicing may be used in aperture synthesis to map large fields of view. So far, only MEM techniques have been used to deconvolve mosaic images (Cornwell (1988)). A CLEAN-based method has been developed, in which the components are located using a modified expression. This allows better utilization of the information and consequent noise reduction in the overlapping regions. Simulations show that this method gives correct clean maps and recovers most of the flux of the sources. The introduction of short-spacing visibilities into the data set is strongly required: their absence introduces an artificial lack of structure on the corresponding scale in the mosaic images. The formation of ``stripes'' in clean maps may also occur, but this phenomenon can be significantly reduced by using the Steer-Dewdney-Ito algorithm (Steer, Dewdney & Ito (1984)) to identify the CLEAN components. Typical IRAM interferometer pointing errors do not have a significant effect on the reconstructed images.
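    The core CLEAN idea behind such methods can be illustrated on a 1-D toy problem. This is a hypothetical sketch of classic Högbom-style CLEAN (iteratively locate the residual peak, record a scaled component, subtract the shifted dirty beam), not the mosaic-specific variant of the paper; the sky, beam, loop gain, and a periodic shift approximation are assumptions:

```python
import numpy as np

# Toy 1-D problem: the dirty map is the sky convolved with a sinc dirty beam.
n = 64
sky = np.zeros(n)
sky[20], sky[35] = 1.0, 0.6          # two point sources
x = np.arange(n) - n // 2
beam = np.sinc(x / 4.0)              # dirty beam with sidelobes
dirty = np.convolve(sky, beam, mode="same")

def hogbom_clean(dirty, beam, gain=0.1, n_iter=500, threshold=1e-3):
    """Högbom CLEAN sketch: peak search, component accumulation, subtraction."""
    residual = dirty.copy()
    components = np.zeros_like(dirty)
    centre = len(beam) // 2
    for _ in range(n_iter):
        p = np.argmax(np.abs(residual))
        peak = residual[p]
        if abs(peak) < threshold:
            break
        components[p] += gain * peak
        # subtract the scaled dirty beam shifted to the peak position
        # (np.roll wraps around; a periodic approximation kept for brevity)
        residual -= gain * peak * np.roll(beam, p - centre)
    return components, residual

comps, resid = hogbom_clean(dirty, beam)
```

    A restored map would then be the components convolved with a clean (e.g. Gaussian) beam plus the residual.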

  15. A macro-micro robot for precise force applications

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Wang, Yulun

    1993-01-01

    This paper describes an 8 degree-of-freedom macro-micro robot capable of performing tasks which require accurate force control. Applications such as polishing, finishing, grinding, deburring, and cleaning are a few examples of tasks which need this capability. Currently these tasks are either performed manually or with dedicated machinery because of the lack of a flexible and cost effective tool, such as a programmable force-controlled robot. The basic design and control of the macro-micro robot is described in this paper. A modular high-performance multiprocessor control system was designed to provide sufficient compute power for executing advanced control methods. An 8 degree of freedom macro-micro mechanism was constructed to enable accurate tip forces. Control algorithms based on the impedance control method were derived, coded, and load balanced for maximum execution speed on the multiprocessor system.

  16. Ensemble machine learning and forecasting can achieve 99% uptime for rural handpumps

    PubMed Central

    Thomas, Evan A.

    2017-01-01

    Broken water pumps continue to impede efforts to deliver clean and economically-viable water to the global poor. The literature has demonstrated that customers’ health benefits and willingness to pay for clean water are best realized when clean water infrastructure performs extremely well (>99% uptime). In this paper, we used sensor data from 42 Afridev-brand handpumps observed for 14 months in western Kenya to demonstrate how sensors and supervised ensemble machine learning could be used to increase total fleet uptime from a best-practices baseline of about 70% to >99%. We accomplish this increase in uptime by forecasting pump failures and identifying existing failures very quickly. Comparing the costs of operating the pump per functional year over a lifetime of 10 years, we estimate that implementing this algorithm would save 7% on the levelized cost of water relative to a sensor-less scheduled maintenance program. Combined with a rigorous system for dispatching maintenance personnel, implementing this algorithm in a real-world program could significantly improve health outcomes and customers’ willingness to pay for water services. PMID:29182673
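    As an illustration of the general approach (not the authors' model, features, or data), the sketch below bags simple threshold classifiers over hypothetical daily pump-sensor features and predicts impending failure by majority vote:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily features: stroke count, mean stroke amplitude, idle hours.
n = 400
X = rng.normal(size=(n, 3))
# Hypothetical rule: failures follow low stroke counts plus long idle time.
y = ((X[:, 0] < -0.5) & (X[:, 2] > 0.3)).astype(int)

def fit_stump(X, y, idx):
    """Best single-feature threshold classifier on a bootstrap sample idx."""
    best = (0, 0.0, 1, 1.0)                       # feature, threshold, sign, error
    for f in range(X.shape[1]):
        for t in np.percentile(X[idx, f], [10, 30, 50, 70, 90]):
            for sign in (1, -1):
                pred = (sign * (X[idx, f] - t) > 0).astype(int)
                err = np.mean(pred != y[idx])
                if err < best[3]:
                    best = (f, t, sign, err)
    return best[:3]

def predict_stump(stump, X):
    f, t, sign = stump
    return (sign * (X[:, f] - t) > 0).astype(int)

# Bagged ensemble: majority vote over stumps trained on bootstrap resamples.
stumps = [fit_stump(X, y, rng.integers(0, n, n)) for _ in range(25)]
votes = np.mean([predict_stump(s, X) for s in stumps], axis=0)
pred = (votes > 0.5).astype(int)
accuracy = np.mean(pred == y)
```

    In practice a library ensemble (e.g. gradient boosting or random forests) with held-out evaluation would replace these hand-rolled stumps.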

  17. Applicability Determination Letters for 40 C.F.R. Part 63 Subpart M, National Perchloroethylene Air Emission Standards for Dry Cleaning Facilities

    EPA Pesticide Factsheets

    This page contains two letters on the applicability of the National Perchloroethylene Air Emission Standards for Dry Cleaning Facilities (40 CFR 63, Subpart M). Both letters clarify what constitutes installation of a dry cleaning machine.

  18. Efficient and Scalable Graph Similarity Joins in MapReduce

    PubMed Central

    Chen, Yifan; Zhang, Weiming; Tang, Jiuyang

    2014-01-01

    Along with the emergence of massive graph-modeled data, it is of great importance to investigate graph similarity joins due to their wide applications for multiple purposes, including data cleaning and near-duplicate detection. This paper considers graph similarity joins with edit distance constraints, which return pairs of graphs such that their edit distances are no larger than a given threshold. Leveraging the MapReduce programming model, we propose MGSJoin, a scalable algorithm following the filtering-verification framework for efficient graph similarity joins. It relies on counting overlapping graph signatures for filtering out nonpromising candidates. To address the potential issue of too many key-value pairs in the filtering phase, spectral Bloom filters are introduced to reduce the number of key-value pairs. Furthermore, we integrate the multiway join strategy to boost the verification, where a MapReduce-based method is proposed for GED calculation. The superior efficiency and scalability of the proposed algorithms are demonstrated by extensive experimental results. PMID:25121135
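    The filtering phase can be sketched as a miniature map/reduce: map each graph to its signature keys, then reduce groups that share a key into candidate pairs for verification. The star-style signatures, toy graphs, and grouping rule here are illustrative assumptions, not the MGSJoin internals:

```python
from collections import defaultdict
from itertools import combinations

# Map phase: emit one (signature, graph_id) key-value pair per signature.
def map_phase(graph_id, signatures):
    return [(sig, graph_id) for sig in signatures]

# Reduce phase: graphs sharing any signature key become candidate pairs;
# graphs sharing nothing are pruned before the expensive GED verification.
def reduce_phase(kv_pairs):
    groups = defaultdict(set)
    for sig, gid in kv_pairs:
        groups[sig].add(gid)
    candidates = set()
    for gids in groups.values():
        for a, b in combinations(sorted(gids), 2):
            candidates.add((a, b))
    return candidates

# Toy signatures: (node label, sorted neighbour labels) "stars" per graph.
graphs = {
    "g1": {("C", ("C",)), ("C", ("C", "O")), ("O", ("C",))},
    "g2": {("C", ("C",)), ("C", ("C", "O")), ("O", ("C",))},
    "g3": {("N", ("N",)), ("N", ("N", "N"))},
}
kv = [p for gid, sigs in graphs.items() for p in map_phase(gid, sigs)]
cands = reduce_phase(kv)   # g1/g2 share signatures; g3 is pruned from candidacy
```

    The number of emitted key-value pairs grows with signature count, which is exactly the pressure the paper's spectral Bloom filters relieve.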

  19. Efficient and scalable graph similarity joins in MapReduce.

    PubMed

    Chen, Yifan; Zhao, Xiang; Xiao, Chuan; Zhang, Weiming; Tang, Jiuyang

    2014-01-01

    Along with the emergence of massive graph-modeled data, it is of great importance to investigate graph similarity joins due to their wide applications for multiple purposes, including data cleaning and near-duplicate detection. This paper considers graph similarity joins with edit distance constraints, which return pairs of graphs such that their edit distances are no larger than a given threshold. Leveraging the MapReduce programming model, we propose MGSJoin, a scalable algorithm following the filtering-verification framework for efficient graph similarity joins. It relies on counting overlapping graph signatures for filtering out nonpromising candidates. To address the potential issue of too many key-value pairs in the filtering phase, spectral Bloom filters are introduced to reduce the number of key-value pairs. Furthermore, we integrate the multiway join strategy to boost the verification, where a MapReduce-based method is proposed for GED calculation. The superior efficiency and scalability of the proposed algorithms are demonstrated by extensive experimental results.

  20. Kernel Methods for Mining Instance Data in Ontologies

    NASA Astrophysics Data System (ADS)

    Bloehdorn, Stephan; Sure, York

    The amount of ontologies and metadata available on the Web is constantly growing. The successful application of machine learning techniques for learning ontologies from textual data, i.e. mining for the Semantic Web, contributes to this trend. However, no principled approaches exist so far for mining from the Semantic Web. We investigate how machine learning algorithms can be made amenable to directly taking advantage of the rich knowledge expressed in ontologies and associated instance data. Kernel methods have been successfully employed in various learning tasks and provide a clean framework for interfacing between non-vectorial data and machine learning algorithms. In this spirit, we express the problem of mining instances in ontologies as the problem of defining valid corresponding kernels. We present a principled framework for designing such kernels by decomposing the kernel computation into specialized kernels for selected characteristics of an ontology, which can be flexibly assembled and tuned. Initial experiments on real-world Semantic Web data yield promising results and show the usefulness of our approach.

  1. Investigation of image enhancement techniques for the development of a self-contained airborne radar navigation system

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.; Karmali, M. S.

    1983-01-01

    This study was devoted to an investigation of the feasibility of applying advanced image processing techniques to enhance radar image characteristics that are pertinent to the pilot's navigation and guidance task. Millimeter (95 GHz) wave radar images for the overwater (i.e., offshore oil rigs) and overland (Heliport) scenario were used as a data base. The purpose of the study was to determine the applicability of image enhancement and scene analysis algorithms to detect and improve target characteristics (i.e., manmade objects such as buildings, parking lots, cars, roads, helicopters, towers, landing pads, etc.) that would be helpful to the pilot in determining his own position/orientation with respect to the outside world and assist him in the navigation task. Results of this study show that significant improvements in the raw radar image may be obtained using two dimensional image processing algorithms. In the overwater case, it is possible to remove the ocean clutter by thresholding the image data, and furthermore to extract the target boundary as well as the tower and catwalk locations using noise cleaning (e.g., median filter) and edge detection (e.g., Sobel operator) algorithms.
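    The noise-cleaning and edge-detection steps named above (median filtering, thresholding, and the Sobel operator) can be sketched in NumPy. The synthetic "radar" frame, kernel sizes, and threshold below are assumptions standing in for the study's 95 GHz imagery:

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter (noise cleaning); edges handled by replicate padding."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    stacked = np.stack([p[i:i + h, j:j + w] for i in range(3) for j in range(3)])
    return np.median(stacked, axis=0)

def sobel_magnitude(img):
    """Gradient magnitude via the two Sobel kernels (edge detection)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(3):
        for j in range(3):
            win = p[i:i + h, j:j + w]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)

# Synthetic frame: a bright rectangular "target" over exponential speckle clutter.
rng = np.random.default_rng(2)
img = rng.exponential(0.3, size=(32, 32))
img[10:20, 12:22] += 2.0

smoothed = median_filter3(img)
cleaned = np.where(smoothed > 1.0, img, 0.0)   # threshold out the clutter
edges = sobel_magnitude(smoothed)              # target boundary extraction
```

    In production one would use `scipy.ndimage.median_filter` and `scipy.ndimage.sobel` rather than these hand-rolled loops.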

  2. Comparative study between chemical and atmospheric pressure plasma jet cleaning on glass substrate

    NASA Astrophysics Data System (ADS)

    Elfa, Rizan Rizon; Ahmad, Mohd Khairul; Fhong, Soon Chin; Sahdan, Mohd Zainizan; Nayan, Nafarizal

    2017-01-01

    An atmospheric pressure plasma jet operating at low frequency with argon as the working gas is presented in this paper to demonstrate its application to glass substrate cleaning and modification. Cleaning glass substrates with an atmospheric pressure plasma jet is an efficient method that can replace other substrate cleaning methods. A comparative analysis is presented between substrates cleaned by chemical and plasma treatment methods. Water contact angle readings were taken for each cleaning method and treatment period. Under plasma treatment, the sample exhibited a superhydrophilic surface with a water contact angle of 7.26°, indicating low surface adhesion. This comparative analysis is relevant to industrial applications, where the time and method of substrate cleaning affect production cost.

  3. ARTIST: A fully automated artifact rejection algorithm for single-pulse TMS-EEG data.

    PubMed

    Wu, Wei; Keller, Corey J; Rogasch, Nigel C; Longwell, Parker; Shpigel, Emmanuel; Rolle, Camarin E; Etkin, Amit

    2018-04-01

    Concurrent single-pulse TMS-EEG (spTMS-EEG) is an emerging noninvasive tool for probing causal brain dynamics in humans. However, in addition to the common artifacts in standard EEG data, spTMS-EEG data suffer from enormous stimulation-induced artifacts, posing significant challenges to the extraction of neural information. Typically, neural signals are analyzed after a manual time-intensive and often subjective process of artifact rejection. Here we describe a fully automated algorithm for spTMS-EEG artifact rejection. A key step of this algorithm is to decompose the spTMS-EEG data into statistically independent components (ICs), and then train a pattern classifier to automatically identify artifact components based on knowledge of the spatio-temporal profile of both neural and artefactual activities. The autocleaned and hand-cleaned data yield qualitatively similar group evoked potential waveforms. The algorithm achieves a 95% IC classification accuracy referenced to expert artifact rejection performance, and does so across a large number of spTMS-EEG data sets (n = 90 stimulation sites), retains high accuracy across stimulation sites/subjects/populations/montages, and outperforms current automated algorithms. Moreover, the algorithm was superior to the artifact rejection performance of relatively novice individuals, who would be the likely users of spTMS-EEG as the technique becomes more broadly disseminated. In summary, our algorithm provides an automated, fast, objective, and accurate method for cleaning spTMS-EEG data, which can increase the utility of TMS-EEG in both clinical and basic neuroscience settings. © 2018 Wiley Periodicals, Inc.
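    The decompose-then-classify idea can be sketched on synthetic data: a compact deflationary FastICA separates a spiky "artifact" component from an oscillatory one, and a simple kurtosis feature flags the artifact. This is an illustrative stand-in for the paper's trained spatio-temporal classifier; the signals, mixing matrix, and kurtosis threshold are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 2000)
neural = np.sin(2 * np.pi * 10 * t)            # ongoing 10 Hz oscillation
artifact = (np.abs(t - 0.5) < 0.01) * 50.0     # brief pulse-like artifact
S = np.c_[neural, artifact].T                  # sources, shape (2, n_samples)
A = np.array([[1.0, 0.6], [0.8, 1.0]])         # mixing into two "channels"
X = A @ S

# Whiten the channel data.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / Xc.shape[1])
Xw = (E @ np.diag(d ** -0.5) @ E.T) @ Xc

def fastica(Xw, n_iter=200):
    """Deflationary FastICA with the tanh nonlinearity."""
    n = Xw.shape[0]
    W = np.zeros((n, n))
    for k in range(n):
        w = rng.normal(size=n)
        for _ in range(n_iter):
            wx = w @ Xw
            g, gp = np.tanh(wx), 1 - np.tanh(wx) ** 2
            w_new = (Xw * g).mean(axis=1) - gp.mean() * w
            w_new -= W[:k].T @ (W[:k] @ w_new)   # decorrelate from found rows
            w = w_new / np.linalg.norm(w_new)
        W[k] = w
    return W @ Xw

ics = fastica(Xw)

def kurtosis(x):
    x = (x - x.mean()) / x.std()
    return np.mean(x ** 4) - 3.0

# Flag high-kurtosis (spiky, sparse) components as artefactual; keep the rest.
flags = [kurtosis(ic) > 5.0 for ic in ics]
```

    ARTIST replaces the single kurtosis rule with a classifier trained on many spatio-temporal IC features, which is what makes it robust across sites, subjects, and montages.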

  4. Robust autofocus algorithm for ISAR imaging of moving targets

    NASA Astrophysics Data System (ADS)

    Li, Jian; Wu, Renbiao; Chen, Victor C.

    2000-08-01

    A robust autofocus approach, referred to as AUTOCLEAN (AUTOfocus via CLEAN), is proposed for the motion compensation in ISAR (inverse synthetic aperture radar) imaging of moving targets. It is a parametric algorithm based on a very flexible data model which takes into account arbitrary range migration and arbitrary phase errors across the synthetic aperture that may be induced by unwanted radial motion of the target as well as propagation or system instability. AUTOCLEAN can be classified as a multiple scatterer algorithm (MSA), but it differs considerably from other existing MSAs in several aspects: (1) dominant scatterers are selected automatically in the two-dimensional (2-D) image domain; (2) scatterers may not be well-isolated or very dominant; (3) phase and RCS (radar cross section) information from each selected scatterer are combined in an optimal way; (4) the troublesome phase unwrapping step is avoided. AUTOCLEAN is computationally efficient and involves only a sequence of FFTs (fast Fourier Transforms). Another good feature associated with AUTOCLEAN is that its performance can be progressively improved by assuming a larger number of dominant scatterers for the target. Hence it can be easily configured for real-time applications including, for example, ATR (automatic target recognition) of non-cooperative moving targets, and for some other applications where the image quality is of the major concern but not the computational time including, for example, for the development and maintenance of low observable aircrafts. Numerical and experimental results have shown that AUTOCLEAN is a very robust autofocus tool for ISAR imaging.

  5. [Evaluation of Medical Instruments Cleaning Effect of Fluorescence Detection Technique].

    PubMed

    Sheng, Nan; Shen, Yue; Li, Zhen; Li, Huijuan; Zhou, Chaoqun

    2016-01-01

    To compare the cleaning effect of an automatic cleaning machine and manual cleaning on coupling-type surgical instruments. A total of 32 cleaned medical instruments were randomly sampled from the disinfection supply center of medical institutions in Putuo District. The Hygiena System SUREII ATP was used to monitor the ATP value, and the cleaning effect was evaluated. The surface ATP values of the manually cleaned instruments were higher than those of instruments cleaned by the automatic cleaning machine. The automatic cleaning machine achieved a better cleaning effect on coupling-type surgical instruments before disinfection, and its application is recommended.

  6. Final Rule: Definition of “Waters of the United States” – Addition of Applicability Date to 2015 Clean Water Rule

    EPA Pesticide Factsheets

    Link to the final rule adding an applicability date to the 2015 Clean Water Rule. The 2015 Rule will not be applicable until two years following publication of the applicability date rule in the Federal Register.

  7. Development of a validated algorithm for the diagnosis of paediatric asthma in electronic medical records

    PubMed Central

    Cave, Andrew J; Davey, Christina; Ahmadi, Elaheh; Drummond, Neil; Fuentes, Sonia; Kazemi-Bajestani, Seyyed Mohammad Reza; Sharpe, Heather; Taylor, Matt

    2016-01-01

    An accurate estimation of the prevalence of paediatric asthma in Alberta and elsewhere is hampered by uncertainty regarding disease definition and diagnosis. Electronic medical records (EMRs) provide a rich source of clinical data from primary-care practices that can be used in better understanding the occurrence of the disease. The Canadian Primary Care Sentinel Surveillance Network (CPCSSN) database includes cleaned data extracted from the EMRs of primary-care practitioners. The purpose of the study was to develop and validate a case definition of asthma in children 1–17 who consult family physicians, in order to provide primary-care estimates of childhood asthma in Alberta as accurately as possible. The validation involved the comparison of the application of a theoretical algorithm (to identify patients with asthma) to a physician review of records included in the CPCSSN database (to confirm an accurate diagnosis). The comparison yielded 87.4% sensitivity, 98.6% specificity and a positive and negative predictive value of 91.2% and 97.9%, respectively, in the age group 1–17 years. The algorithm was also run for ages 3–17 and 6–17 years, and was found to have comparable statistical values. Overall, the case definition and algorithm yielded strong sensitivity and specificity metrics and was found valid for use in research in CPCSSN primary-care practices. The use of the validated asthma algorithm may improve insight into the prevalence, diagnosis, and management of paediatric asthma in Alberta and Canada. PMID:27882997
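    The validation arithmetic behind the reported metrics is standard. The sketch below computes sensitivity, specificity, PPV, and NPV from a confusion matrix, using hypothetical counts for illustration (the paper's raw counts are not given here):

```python
# Standard diagnostic-validation metrics from confusion-matrix counts.
def validation_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # algorithm-positive among true cases
        "specificity": tn / (tn + fp),   # algorithm-negative among non-cases
        "ppv": tp / (tp + fp),           # true cases among algorithm-positives
        "npv": tn / (tn + fn),           # non-cases among algorithm-negatives
    }

# Hypothetical example: 90 true asthma cases and 910 non-cases reviewed.
m = validation_metrics(tp=83, fp=8, fn=7, tn=902)
```

    With these made-up counts the metrics land near the paper's reported range (high-80s sensitivity, high-90s specificity), showing how the four figures trade off against one another.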

  8. Development of a validated algorithm for the diagnosis of paediatric asthma in electronic medical records.

    PubMed

    Cave, Andrew J; Davey, Christina; Ahmadi, Elaheh; Drummond, Neil; Fuentes, Sonia; Kazemi-Bajestani, Seyyed Mohammad Reza; Sharpe, Heather; Taylor, Matt

    2016-11-24

    An accurate estimation of the prevalence of paediatric asthma in Alberta and elsewhere is hampered by uncertainty regarding disease definition and diagnosis. Electronic medical records (EMRs) provide a rich source of clinical data from primary-care practices that can be used in better understanding the occurrence of the disease. The Canadian Primary Care Sentinel Surveillance Network (CPCSSN) database includes cleaned data extracted from the EMRs of primary-care practitioners. The purpose of the study was to develop and validate a case definition of asthma in children 1-17 who consult family physicians, in order to provide primary-care estimates of childhood asthma in Alberta as accurately as possible. The validation involved the comparison of the application of a theoretical algorithm (to identify patients with asthma) to a physician review of records included in the CPCSSN database (to confirm an accurate diagnosis). The comparison yielded 87.4% sensitivity, 98.6% specificity and a positive and negative predictive value of 91.2% and 97.9%, respectively, in the age group 1-17 years. The algorithm was also run for ages 3-17 and 6-17 years, and was found to have comparable statistical values. Overall, the case definition and algorithm yielded strong sensitivity and specificity metrics and was found valid for use in research in CPCSSN primary-care practices. The use of the validated asthma algorithm may improve insight into the prevalence, diagnosis, and management of paediatric asthma in Alberta and Canada.

  9. 75 FR 9181 - Secretarial Indonesia Clean Energy Business Development Mission: Application Deadline Extended

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-01

    ... DEPARTMENT OF COMMERCE International Trade Administration Secretarial Indonesia Clean Energy Business Development Mission: Application Deadline Extended AGENCY: International Trade Administration, Department of Commerce. ACTION: Notice. Timeframe for Recruitment and Applications Mission recruitment will...

  10. Maximum power point tracking for photovoltaic applications by using two-level DC/DC boost converter

    NASA Astrophysics Data System (ADS)

    Moamaei, Parvin

    Recently, photovoltaic (PV) generation has become increasingly popular in industrial applications. As a renewable and alternative energy source, PV features superior characteristics such as clean and silent operation, along with fewer maintenance problems compared to other energy sources. In PV generation, employing a Maximum Power Point Tracking (MPPT) method is essential to obtain the maximum available solar energy. Among several proposed MPPT techniques, the Perturbation and Observation (P&O) and Model Predictive Control (MPC) methods are adopted in this work. The components of the MPPT control system, namely the P&O and MPC algorithms, the PV module, and a high-gain DC-DC boost converter, are simulated in MATLAB Simulink. They are evaluated under rapidly and slowly changing solar irradiation and temperature, and their performance is shown by the simulation results; finally, a comprehensive comparison is presented.
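    The P&O algorithm itself is simple to sketch: perturb the operating voltage, observe the power change, and reverse the perturbation direction whenever power falls. The toy concave P-V curve below is an illustrative stand-in for a real module model (the thesis uses MATLAB Simulink); its peak location, step size, and start point are assumptions:

```python
# Illustrative P-V curve: concave with its maximum at v_mpp (not a real module).
def pv_power(v, v_mpp=30.0, p_max=200.0):
    return max(0.0, p_max - 0.5 * (v - v_mpp) ** 2)

def perturb_and_observe(v0=20.0, step=0.5, n_steps=100):
    """Classic P&O hill climbing toward the maximum power point."""
    v, p_prev, direction = v0, pv_power(v0), +1
    for _ in range(n_steps):
        v += direction * step          # perturb the operating voltage
        p = pv_power(v)                # observe the resulting power
        if p < p_prev:                 # power fell: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

v_final = perturb_and_observe()        # settles to oscillate around v_mpp
```

    The steady-state oscillation of one step width around the maximum power point is the well-known drawback of P&O that model-predictive schemes aim to reduce.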

  11. A new computer approach to mixed feature classification for forestry application

    NASA Technical Reports Server (NTRS)

    Kan, E. P.

    1976-01-01

    A computer approach for mapping mixed forest features (i.e., types, classes) from computer classification maps is discussed. Mixed features such as mixed softwood/hardwood stands are treated as admixtures of softwood and hardwood areas. Large-area mixed features are identified and small-area features neglected when the nominal size of a mixed feature can be specified. The computer program merges small isolated areas into surrounding areas by the iterative manipulation of the postprocessing algorithm that eliminates small connected sets. For a forestry application, computer-classified LANDSAT multispectral scanner data of the Sam Houston National Forest were used to demonstrate the proposed approach. The technique was successful in cleaning the salt-and-pepper appearance of multiclass classification maps and in mapping admixtures of softwood areas and hardwood areas. However, the computer-mapped mixed areas matched very poorly with the ground truth because of inadequate resolution and inappropriate definition of mixed features.
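    The post-processing step that eliminates small connected sets can be sketched as follows: find each 4-connected region in the class map and, if it is smaller than a size threshold, relabel it with its dominant neighbouring class. The map, threshold, and connectivity below are illustrative assumptions, not the original program:

```python
import numpy as np
from collections import deque

def merge_small_regions(label_map, min_size):
    """Merge 4-connected regions smaller than min_size into their surroundings."""
    h, w = label_map.shape
    out = label_map.copy()
    seen = np.zeros((h, w), bool)
    for si in range(h):
        for sj in range(w):
            if seen[si, sj]:
                continue
            cls = out[si, sj]
            region, border = [], {}
            q = deque([(si, sj)])
            seen[si, sj] = True
            while q:                                  # BFS over one region
                i, j = q.popleft()
                region.append((i, j))
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        if out[ni, nj] == cls and not seen[ni, nj]:
                            seen[ni, nj] = True
                            q.append((ni, nj))
                        elif out[ni, nj] != cls:      # tally bordering classes
                            border[out[ni, nj]] = border.get(out[ni, nj], 0) + 1
            if len(region) < min_size and border:
                new_cls = max(border, key=border.get)  # dominant neighbour class
                for i, j in region:
                    out[i, j] = new_cls
    return out

m = np.zeros((8, 8), int)       # hardwood background (class 0)
m[2:6, 2:6] = 1                 # softwood stand (class 1)
m[4, 4] = 2                     # single-pixel "salt" speckle
cleaned = merge_small_regions(m, min_size=3)
```

    Iterating this operation, as the paper describes, removes salt-and-pepper speckle while leaving large mixed stands intact.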

  12. Design and implementation of a privacy preserving electronic health record linkage tool in Chicago

    PubMed Central

    Cashy, John P; Jackson, Kathryn L; Pah, Adam R; Goel, Satyender; Boehnke, Jörn; Humphries, John Eric; Kominers, Scott Duke; Hota, Bala N; Sims, Shannon A; Malin, Bradley A; French, Dustin D; Walunas, Theresa L; Meltzer, David O; Kaleba, Erin O; Jones, Roderick C; Galanter, William L

    2015-01-01

    Objective To design and implement a tool that creates a secure, privacy preserving linkage of electronic health record (EHR) data across multiple sites in a large metropolitan area in the United States (Chicago, IL), for use in clinical research. Methods The authors developed and distributed a software application that performs standardized data cleaning, preprocessing, and hashing of patient identifiers to remove all protected health information. The application creates seeded hash code combinations of patient identifiers using a Health Insurance Portability and Accountability Act compliant SHA-512 algorithm that minimizes re-identification risk. The authors subsequently linked individual records using a central honest broker with an algorithm that assigns weights to hash combinations in order to generate high specificity matches. Results The software application successfully linked and de-duplicated 7 million records across 6 institutions, resulting in a cohort of 5 million unique records. Using a manually reconciled set of 11 292 patients as a gold standard, the software achieved a sensitivity of 96% and a specificity of 100%, with a majority of the missed matches accounted for by patients with both a missing social security number and last name change. Using 3 disease examples, it is demonstrated that the software can reduce duplication of patient records across sites by as much as 28%. Conclusions Software that standardizes the assignment of a unique seeded hash identifier merged through an agreed upon third-party honest broker can enable large-scale secure linkage of EHR data for epidemiologic and public health research. The software algorithm can improve future epidemiologic research by providing more comprehensive data given that patients may make use of multiple healthcare systems. PMID:26104741
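    The privacy-preserving linkage idea can be sketched with Python's standard library: standardize each identifier, then compute seeded SHA-512 hashes over several identifier combinations so a single missing or changed field does not break the match. The normalization rules, seed handling, field combinations, and example record below are illustrative assumptions, not the deployed application's exact design:

```python
import hashlib
import unicodedata

SEED = "local-site-secret"   # hypothetical shared seed, distributed out of band

def normalize(field):
    """Standardize a field before hashing: trim, casefold, strip accents."""
    field = unicodedata.normalize("NFKD", field.strip().casefold())
    return "".join(c for c in field if not unicodedata.combining(c))

def hash_combo(*fields):
    """Seeded SHA-512 over a combination of normalized identifiers."""
    payload = SEED + "|" + "|".join(normalize(f) for f in fields)
    return hashlib.sha512(payload.encode("utf-8")).hexdigest()

# Multiple hash combinations per patient tolerate a missing or changed field
# (e.g. a last-name change still matches on the ssn+dob combination).
record = {"first": "José", "last": "García",
          "dob": "1980-01-02", "ssn": "123-45-6789"}   # hypothetical record
combos = {
    "name_dob": hash_combo(record["first"], record["last"], record["dob"]),
    "ssn_dob": hash_combo(record["ssn"], record["dob"]),
    "first_ssn": hash_combo(record["first"], record["ssn"]),
}
```

    Only these hash combinations leave the site; the honest broker then links records whose weighted combinations agree, without ever seeing protected health information.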

  13. Design and implementation of a privacy preserving electronic health record linkage tool in Chicago.

    PubMed

    Kho, Abel N; Cashy, John P; Jackson, Kathryn L; Pah, Adam R; Goel, Satyender; Boehnke, Jörn; Humphries, John Eric; Kominers, Scott Duke; Hota, Bala N; Sims, Shannon A; Malin, Bradley A; French, Dustin D; Walunas, Theresa L; Meltzer, David O; Kaleba, Erin O; Jones, Roderick C; Galanter, William L

    2015-09-01

    To design and implement a tool that creates a secure, privacy preserving linkage of electronic health record (EHR) data across multiple sites in a large metropolitan area in the United States (Chicago, IL), for use in clinical research. The authors developed and distributed a software application that performs standardized data cleaning, preprocessing, and hashing of patient identifiers to remove all protected health information. The application creates seeded hash code combinations of patient identifiers using a Health Insurance Portability and Accountability Act compliant SHA-512 algorithm that minimizes re-identification risk. The authors subsequently linked individual records using a central honest broker with an algorithm that assigns weights to hash combinations in order to generate high specificity matches. The software application successfully linked and de-duplicated 7 million records across 6 institutions, resulting in a cohort of 5 million unique records. Using a manually reconciled set of 11 292 patients as a gold standard, the software achieved a sensitivity of 96% and a specificity of 100%, with a majority of the missed matches accounted for by patients with both a missing social security number and last name change. Using 3 disease examples, it is demonstrated that the software can reduce duplication of patient records across sites by as much as 28%. Software that standardizes the assignment of a unique seeded hash identifier merged through an agreed upon third-party honest broker can enable large-scale secure linkage of EHR data for epidemiologic and public health research. The software algorithm can improve future epidemiologic research by providing more comprehensive data given that patients may make use of multiple healthcare systems. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Depth-resolved analytical model and correction algorithm for photothermal optical coherence tomography

    PubMed Central

    Lapierre-Landry, Maryse; Tucker-Schwartz, Jason M.; Skala, Melissa C.

    2016-01-01

    Photothermal OCT (PT-OCT) is an emerging molecular imaging technique that occupies a spatial imaging regime between microscopy and whole body imaging. PT-OCT would benefit from a theoretical model to optimize imaging parameters and test image processing algorithms. We propose the first analytical PT-OCT model to replicate an experimental A-scan in homogeneous and layered samples. We also propose the PT-CLEAN algorithm to reduce phase-accumulation and shadowing, two artifacts found in PT-OCT images, and demonstrate it on phantoms and in vivo mouse tumors. PMID:27446693

  15. Validation of new satellite aerosol optical depth retrieval algorithm using Raman lidar observations at radiative transfer laboratory in Warsaw

    NASA Astrophysics Data System (ADS)

    Zawadzka, Olga; Stachlewska, Iwona S.; Markowicz, Krzysztof M.; Nemuc, Anca; Stebel, Kerstin

    2018-04-01

    During an exceptionally warm September of 2016, unique, stable weather conditions over Poland allowed for extensive testing of the new algorithm developed to improve the Meteosat Second Generation (MSG) Spinning Enhanced Visible and Infrared Imager (SEVIRI) aerosol optical depth (AOD) retrieval. The development was conducted in the frame of the ESA-ESRIN SAMIRA project. The new AOD algorithm aims at providing aerosol optical depth maps over the territory of Poland with a high temporal resolution of 15 minutes. It was tested on the data set obtained between 11-16 September 2016, during which a day of relatively clean atmospheric background, related to an Arctic airmass inflow, was surrounded by a few days with markedly increased aerosol load of different origin. On the clean reference day, the AOD forecast available on-line via the Copernicus Atmosphere Monitoring Service (CAMS) was used for estimating surface reflectance. The obtained AOD maps were validated against AODs available within the Poland-AOD and AERONET networks, and against AOD values obtained from the PollyXT-UW lidar of the University of Warsaw (UW).

  16. Graphene-based room-temperature implementation of a modified Deutsch-Jozsa quantum algorithm.

    PubMed

    Dragoman, Daniela; Dragoman, Mircea

    2015-12-04

    We present an implementation of a one-qubit and two-qubit modified Deutsch-Jozsa quantum algorithm based on graphene ballistic devices working at room temperature. The modified Deutsch-Jozsa algorithm decides whether a function, equivalent to the effect of an energy potential distribution on the wave function of ballistic charge carriers, is constant or not, without measuring the output wave function. The function need not be Boolean. Simulations confirm that the algorithm works properly, opening the way toward quantum computing at room temperature based on the same clean-room technologies as those used for fabrication of very-large-scale integrated circuits.
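
    The decision the algorithm makes can be reproduced classically with a small state-vector simulation. The sketch below implements the standard textbook phase-oracle version of Deutsch-Jozsa, not the authors' graphene-specific modification (which avoids measuring the output wave function and admits non-Boolean functions):

```python
import numpy as np

def hadamard_n(n):
    """n-qubit Hadamard transform as a dense matrix."""
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    M = np.array([[1.0]])
    for _ in range(n):
        M = np.kron(M, H)
    return M

def deutsch_jozsa(f, n):
    """Return 'constant' or 'balanced' for an n-bit Boolean oracle f
    promised to be one or the other, simulated with phase kickback."""
    N = 2 ** n
    state = hadamard_n(n) @ np.eye(N)[0]          # H^n |0...0>
    # Phase oracle: |x> -> (-1)^f(x) |x>
    state = np.array([(-1) ** f(x) * amp for x, amp in enumerate(state)])
    state = hadamard_n(n) @ state                 # H^n again
    # Probability of measuring |0...0> is 1 iff f is constant
    return "constant" if abs(state[0]) ** 2 > 0.5 else "balanced"
```

    A single oracle query suffices, which is the quantum advantage the record refers to.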

  17. 75 FR 9181 - Secretarial China Clean Energy Business Development Mission; Application Deadline Extended

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-01

    ... DEPARTMENT OF COMMERCE International Trade Administration Secretarial China Clean Energy Business Development Mission; Application Deadline Extended AGENCY: International Trade Administration, Department of... (202-482-1360 or [email protected] ). The application deadline has been extended to Friday...

  18. ROBIN: a platform for evaluating automatic target recognition algorithms: I. Overview of the project and presentation of the SAGEM DS competition

    NASA Astrophysics Data System (ADS)

    Duclos, D.; Lonnoy, J.; Guillerm, Q.; Jurie, F.; Herbin, S.; D'Angelo, E.

    2008-04-01

    The last five years have seen a renewal of Automatic Target Recognition applications, mainly because of the latest advances in machine learning techniques. In this context, large collections of image datasets are essential for training algorithms as well as for their evaluation. Indeed, the recent proliferation of recognition algorithms, generally applied to slightly different problems, makes comparing them through clean evaluation campaigns necessary. The ROBIN project tries to fulfil these two needs by putting unclassified datasets, ground truths, competitions and metrics for the evaluation of ATR algorithms at the disposal of the scientific community. The scope of this project includes single-class and multi-class generic target detection and generic target recognition, in military and security contexts. To our knowledge, it is the first time that a database of this importance (several hundred thousand visible and infrared hand-annotated images) has been publicly released. Funded by the French Ministry of Defence (DGA) and by the French Ministry of Research, ROBIN is one of the ten Techno-vision projects. Techno-vision is a large and ambitious government initiative for building evaluation means for computer vision technologies, for various application contexts. ROBIN's consortium includes major companies and research centres involved in computer vision R&D in the field of defence: Bertin Technologies, CNES, ECA, DGA, EADS, INRIA, ONERA, MBDA, SAGEM, THALES. This paper, which first gives an overview of the whole project, focuses on one of ROBIN's key competitions, the SAGEM Defence Security database. This dataset contains more than eight hundred ground and aerial infrared images of six different vehicles in cluttered scenes including distracters. Two different sets of data are available for each target. The first set includes different views of each vehicle at close range in a "simple" background and can be used to train algorithms. The second set contains many views of the same vehicle in different contexts and situations, simulating operational scenarios.

  19. 40 CFR 1068.110 - What other provisions apply to engines/equipment in service?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... information regarding test programs, engineering evaluations, design specifications, calibrations, on-board computer algorithms, and design strategies. It is a violation of the Clean Air Act for anyone to make...

  20. 40 CFR 1068.110 - What other provisions apply to engines/equipment in service?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... information regarding test programs, engineering evaluations, design specifications, calibrations, on-board computer algorithms, and design strategies. It is a violation of the Clean Air Act for anyone to make...

  1. 40 CFR 1068.110 - What other provisions apply to engines/equipment in service?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... information regarding test programs, engineering evaluations, design specifications, calibrations, on-board computer algorithms, and design strategies. It is a violation of the Clean Air Act for anyone to make...

  2. Development and Testing of Geo-Processing Models for the Automatic Generation of Remediation Plan and Navigation Data to Use in Industrial Disaster Remediation

    NASA Astrophysics Data System (ADS)

    Lucas, G.; Lénárt, C.; Solymosi, J.

    2015-08-01

    This paper introduces research on the automatic preparation of remediation plans and navigation data for the precise guidance of heavy machinery in clean-up work after an industrial disaster. The input test data consist of a pollution-extent shapefile derived from the processing of hyperspectral aerial survey data from the Kolontár red mud disaster. Three algorithms were developed and the respective scripts were written in Python. The first model draws a parcel clean-up plan: it tests four parcel orientations (0, 45, 90 and 135 degrees) and keeps the plan with the fewest clean-up parcels, treating this as the optimal spatial configuration. The second model shifts the clean-up parcels of a work plan both vertically and horizontally following a grid pattern with a sampling distance of one fifth of a parcel width, and keeps the best shifted version, again with the aim of reducing the final number of parcel features. The last model draws a navigation line in the middle of each clean-up parcel. The models work efficiently and achieve automatic optimized plan generation (parcels and navigation lines). Applying the first model, we demonstrated that, depending on the size and geometry of the features of the contaminated-area layer, the number of clean-up parcels generated by the model varies by 4% to 38% from plan to plan. Such a significant variation in the resulting feature counts shows that identifying the optimal orientation can save work, time and money in remediation. The various tests demonstrated that the model gains efficiency when (1) the individual features of the contaminated area have a pronounced orientation (the features are elongated), and (2) the size of the pollution-extent features approaches the size of the parcels (a scale effect). The second model yields only a 1% variation in feature number, so it is less interesting for planning optimization. The last model simply fulfils the task it was designed for by drawing navigation lines.
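
    The first model's selection rule (try four grid orientations, keep the one producing the fewest parcels) can be sketched as below. The point-based coverage count is a simplification of the actual shapefile processing, and all names are illustrative:

```python
import math

def parcels_needed(points, width, height, angle_deg):
    """Count grid cells (clean-up parcels) of size width x height
    occupied by a set of contaminated points after rotating the
    grid by angle_deg (equivalently, rotating the points by -angle)."""
    a = math.radians(-angle_deg)
    cells = set()
    for x, y in points:
        xr = x * math.cos(a) - y * math.sin(a)
        yr = x * math.sin(a) + y * math.cos(a)
        cells.add((math.floor(xr / width), math.floor(yr / height)))
    return len(cells)

def best_orientation(points, width, height, angles=(0, 45, 90, 135)):
    """Keep the orientation yielding the fewest parcels, mirroring
    the first model's selection criterion."""
    return min(angles, key=lambda a: parcels_needed(points, width, height, a))
```

    For an elongated diagonal contamination strip and rectangular parcels, the 45-degree grid covers it with markedly fewer parcels than the axis-aligned one, which is the efficiency effect the paper reports for elongated features.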

  3. OpenStructure: a flexible software framework for computational structural biology.

    PubMed

    Biasini, Marco; Mariani, Valerio; Haas, Jürgen; Scheuber, Stefan; Schenk, Andreas D; Schwede, Torsten; Philippsen, Ansgar

    2010-10-15

    Developers of new methods in computational structural biology are often hampered in their research by incompatible software tools and non-standardized data formats. To address this problem, we have developed OpenStructure as a modular open-source platform to provide a powerful yet flexible general working environment for structural bioinformatics. OpenStructure consists primarily of a set of libraries written in C++ with a cleanly designed application programmer interface. All functionality can be accessed directly in C++ or in a Python layer, meeting both the requirements for high efficiency and ease of use. Powerful selection queries and the notion of entity views to represent these selections greatly facilitate the development and implementation of algorithms on structural data. The modular integration of computational core methods with powerful visualization tools makes OpenStructure an ideal working and development environment. Several applications, such as the latest versions of IPLT and QMean, have been implemented based on OpenStructure, demonstrating its value for the development of next-generation structural biology algorithms. Source code licensed under the GNU Lesser General Public License and binaries for MacOS X, Linux and Windows are available for download at http://www.openstructure.org. torsten.schwede@unibas.ch Supplementary data are available at Bioinformatics online.

  4. Photochemical Modeling Applications

    EPA Pesticide Factsheets

    Provides access to modeling applications involving photochemical models, including modeling of ozone, particulate matter (PM), and mercury for national and regional EPA regulations such as the Clean Air Interstate Rule (CAIR) and the Clean Air Mercury Rule

  5. 40 CFR 63.460 - Applicability and designation of source.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 18, material safety data sheets, or engineering calculations. Wipe cleaning activities, such as using... continuous web cleaning machine subject to this subpart shall achieve compliance with the provisions of this... products, solvent cleaning machines used in the manufacture of narrow tubing, and continuous web cleaning...

  6. 40 CFR 63.460 - Applicability and designation of source.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 18, material safety data sheets, or engineering calculations. Wipe cleaning activities, such as using... continuous web cleaning machine subject to this subpart shall achieve compliance with the provisions of this... products, solvent cleaning machines used in the manufacture of narrow tubing, and continuous web cleaning...

  7. 14 CFR 1260.34 - Clean air and water.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Clean air and water. 1260.34 Section 1260... AGREEMENTS General Provisions § 1260.34 Clean air and water. Clean Air and Water October 2000 (Applicable... the Clean Air Act (42 U.S.C. 1857c-8(c)(1) or the Federal Water Pollution Control Act (33 U.S.C. 1319...

  8. 14 CFR 1260.34 - Clean air and water.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 5 2013-01-01 2013-01-01 false Clean air and water. 1260.34 Section 1260... AGREEMENTS General Provisions § 1260.34 Clean air and water. Clean Air and Water October 2000 (Applicable... the Clean Air Act (42 U.S.C. 1857c-8(c)(1) or the Federal Water Pollution Control Act (33 U.S.C. 1319...

  9. Alternative Fuels Data Center

    Science.gov Websites

    High Occupancy Vehicle (HOV) Lane Exemption: through the Clean Pass Program, eligible plug-in vehicles may use HOV lanes regardless of the number of occupants in the vehicle. Vehicles must display the Clean Pass vehicle sticker. For a list of eligible vehicles and Clean Pass sticker application instructions, see the Clean Pass program.

  10. 40 CFR 463.20 - Applicability; description of the cleaning water subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PLASTICS MOLDING AND FORMING POINT SOURCE CATEGORY Cleaning... the cleaning water subcategory are processes where water comes in contact with the plastic product for... equipment, such as molds and mandrels, that contact the plastic material for the purpose of cleaning the...

  11. Peripherally inserted central catheter - dressing change

    MedlinePlus

    ... chlorhexidine) in a single-use small applicator Special sponges or wipes that contain a cleaning agent, such ... Clean your skin around the site with the sponge and cleaning solution for 30 seconds. Let the ...

  12. Optimal design of groundwater remediation system using a probabilistic multi-objective fast harmony search algorithm under uncertainty

    NASA Astrophysics Data System (ADS)

    Luo, Qiankun; Wu, Jianfeng; Yang, Yun; Qian, Jiazhong; Wu, Jichun

    2014-11-01

    This study develops a new probabilistic multi-objective fast harmony search algorithm (PMOFHS) for optimal design of groundwater remediation systems under uncertainty associated with the hydraulic conductivity (K) of aquifers. The PMOFHS integrates the previously developed deterministic multi-objective optimization method, namely multi-objective fast harmony search algorithm (MOFHS) with a probabilistic sorting technique to search for Pareto-optimal solutions to multi-objective optimization problems in a noisy hydrogeological environment arising from insufficient K data. The PMOFHS is then coupled with the commonly used flow and transport codes, MODFLOW and MT3DMS, to identify the optimal design of groundwater remediation systems for a two-dimensional hypothetical test problem and a three-dimensional Indiana field application involving two objectives: (i) minimization of the total remediation cost through the engineering planning horizon, and (ii) minimization of the mass remaining in the aquifer at the end of the operational period, whereby the pump-and-treat (PAT) technology is used to clean up contaminated groundwater. Also, Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology. Comprehensive analysis indicates that the proposed PMOFHS can find Pareto-optimal solutions with low variability and high reliability and is a potentially effective tool for optimizing multi-objective groundwater remediation problems under uncertainty.
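
    At the core of any multi-objective search such as the PMOFHS is the Pareto-dominance test over the two objectives (total remediation cost, residual contaminant mass), both minimized. A minimal sketch of that test, independent of the harmony-search machinery and with illustrative names:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b: a is no worse in every
    objective and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of (cost, residual_mass) tuples."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]
```

    The probabilistic sorting the paper adds effectively repeats this comparison over multiple K realizations, keeping solutions that remain non-dominated with high reliability.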

  13. An Improved Clustering Algorithm of Tunnel Monitoring Data for Cloud Computing

    PubMed Central

    Zhong, Luo; Tang, KunHao; Li, Lin; Yang, Guang; Ye, JingJing

    2014-01-01

    With the rapid development of urban construction, the number of urban tunnels is increasing and the data they produce become more and more complex. It results in the fact that the traditional clustering algorithm cannot handle the mass data of the tunnel. To solve this problem, an improved parallel clustering algorithm based on k-means has been proposed. It is a clustering algorithm using the MapReduce within cloud computing that deals with data. It not only has the advantage of being used to deal with mass data but also is more efficient. Moreover, it is able to compute the average dissimilarity degree of each cluster in order to clean the abnormal data. PMID:24982971
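
    The cleaning criterion described (compute each cluster's average dissimilarity and flag points that deviate too far from their centroid) can be sketched with a serial k-means; in the paper the assignment step is distributed with MapReduce, which changes the execution model, not the arithmetic. Names and the `factor` threshold below are illustrative:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain serial k-means (Lloyd's algorithm)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assignment step (the part MapReduce parallelizes)
        labels = np.argmin(np.linalg.norm(X[:, None] - centers, axis=2), axis=1)
        # Update step; keep old center if a cluster empties out
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

def flag_abnormal(X, labels, centers, factor=2.0):
    """Flag points whose distance to their centroid exceeds `factor`
    times the cluster's average dissimilarity (the cleaning step)."""
    d = np.linalg.norm(X - centers[labels], axis=1)
    flags = np.zeros(len(X), dtype=bool)
    for j in np.unique(labels):
        m = labels == j
        flags[m] = d[m] > factor * d[m].mean()
    return flags
```

    Points flagged this way are the "abnormal data" candidates the tunnel-monitoring pipeline would clean before further analysis.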

  14. 14 CFR § 1260.34 - Clean air and water.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 5 2014-01-01 2014-01-01 false Clean air and water. § 1260.34 Section Â... AGREEMENTS General Provisions § 1260.34 Clean air and water. Clean Air and Water October 2000 (Applicable... the Clean Air Act (42 U.S.C. 1857c-8(c)(1) or the Federal Water Pollution Control Act (33 U.S.C. 1319...

  15. The Principle and the Application of Self-cleaning Anti-pollution Coating in Power System

    NASA Astrophysics Data System (ADS)

    Zhao, Y. J.; Zhang, Z. B.; Liu, Y.; Wang, J. H.; Teng, J. L.; Wu, L. S.; Zhang, Y. L.

    2017-11-01

    The common problem existed in power system is analyzed in this paper. The main reason for the affection of the safe and stable operation to power equipment is flash-over caused by dirt and discharge. Using the self-cleaning anti-pollution coating in the power equipment surface is the key to solve the problem. In the work, the research progress and design principle about the self-cleaning anti-pollution coating was summarized. Furthermore, the preparation technology was also studied. Finally, the application prospect of hard self-cleaning anti-pollution coating in power system was forecast.

  16. Investigation of radio astronomy image processing techniques for use in the passive millimetre-wave security screening environment

    NASA Astrophysics Data System (ADS)

    Taylor, Christopher T.; Hutchinson, Simon; Salmon, Neil A.; Wilkinson, Peter N.; Cameron, Colin D.

    2014-06-01

    Image processing techniques can be used to improve the cost-effectiveness of future interferometric Passive MilliMetre Wave (PMMW) imagers. The implementation of such techniques will allow for a reduction in the number of collecting elements whilst ensuring adequate image fidelity is maintained. Various techniques have been developed by the radio astronomy community to enhance the imaging capability of sparse interferometric arrays. The most prominent are Multi-Frequency Synthesis (MFS) and non-linear deconvolution algorithms, such as the Maximum Entropy Method (MEM) and variations of the CLEAN algorithm. This investigation focuses on the implementation of these methods in the de facto standard for radio astronomy image processing, the Common Astronomy Software Applications (CASA) package, building upon the discussion presented in Taylor et al., SPIE 8362-0F. We describe the image conversion process into a CASA-suitable format, followed by a series of simulations that exploit the highlighted deconvolution and MFS algorithms assuming far-field imagery. The primary target application used for this investigation is an outdoor security scanner for soft-sided Heavy Goods Vehicles. A quantitative analysis of the effectiveness of the aforementioned image processing techniques is presented, with thoughts on the potential cost savings such an approach could yield. Consideration is also given to how the implementation of these techniques in CASA might be adapted to operate in a near-field target environment. This may enable much wider usability by the imaging community outside of radio astronomy and thus would be directly relevant to portal screening security systems in the microwave and millimetre wave bands.
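
    The CLEAN family of deconvolution algorithms referenced here shares one core loop: find the brightest residual peak, subtract a scaled, shifted copy of the point spread function, and record the subtracted flux as a clean component. A minimal 1-D Högbom-style sketch (CASA's production implementations are far more elaborate; names are illustrative):

```python
import numpy as np

def hogbom_clean(dirty, psf, gain=0.1, niter=500, threshold=1e-3):
    """Minimal 1-D Hogbom CLEAN: iteratively peel point-source flux
    out of the dirty image until the residual drops below threshold."""
    res = dirty.astype(float)
    comps = np.zeros_like(res)
    center = len(psf) // 2
    for _ in range(niter):
        p = int(np.argmax(np.abs(res)))
        if abs(res[p]) < threshold:
            break
        flux = gain * res[p]
        comps[p] += flux
        # Subtract the PSF centered at p (clipped at the edges)
        for i in range(len(res)):
            j = i - p + center
            if 0 <= j < len(psf):
                res[i] -= flux * psf[j]
    return comps, res
```

    For a point source the recovered component flux converges geometrically (by the loop gain) to the true value, while the residual falls below the stopping threshold.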

  17. Document analysis with neural net circuits

    NASA Technical Reports Server (NTRS)

    Graf, Hans Peter

    1994-01-01

    Document analysis is one of the main applications of machine vision today and offers great opportunities for neural net circuits. Despite more and more data processing with computers, the number of paper documents is still increasing rapidly. A fast translation of data from paper into electronic format is needed almost everywhere, and when done manually, this is a time consuming process. Markets range from small scanners for personal use to high-volume document analysis systems, such as address readers for the postal service or check processing systems for banks. A major concern with present systems is the accuracy of the automatic interpretation. Today's algorithms fail miserably when noise is present, when print quality is poor, or when the layout is complex. A common approach to circumvent these problems is to restrict the variations of the documents handled by a system. In our laboratory, we had the best luck with circuits implementing basic functions, such as convolutions, that can be used in many different algorithms. To illustrate the flexibility of this approach, three applications of the NET32K circuit are described in this short viewgraph presentation: locating address blocks, cleaning document images by removing noise, and locating areas of interest in personal checks to improve image compression. Several of the ideas realized in this circuit that were inspired by neural nets, such as analog computation with a low resolution, resulted in a chip that is well suited for real-world document analysis applications and that compares favorably with alternative, 'conventional' circuits.
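
    Noise cleaning of the kind mentioned (removing speckle from scanned document images) reduces to local convolution-style operations, which is exactly what a convolutional chip like NET32K evaluates in parallel. A minimal sketch using a 3x3 neighbor count on a binary image (illustrative, not the NET32K algorithm itself):

```python
import numpy as np

def despeckle(img, min_neighbors=2):
    """Remove isolated dark pixels from a binary document image by
    counting each pixel's 8-connected neighbors (a 3x3 convolution
    with a hole in the middle) and zeroing under-connected pixels."""
    padded = np.pad(img, 1)  # zero border so edges have no phantom neighbors
    neighbors = sum(np.roll(np.roll(padded, dy, 0), dx, 1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))[1:-1, 1:-1]
    return np.where((img == 1) & (neighbors < min_neighbors), 0, img)
```

    Isolated specks vanish while connected strokes, whose pixels have several dark neighbors, survive untouched.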

  18. 78 FR 76829 - Approval of Application Submitted by Eastern Shoshone Tribe and Northern Arapaho Tribe for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-19

    ... regulatory authority under the Clean Air Act. DATES: EPA's decision approving the Tribes' TAS application was... Decision Document, Attachment 1 (Legal Analysis of the Wind River Indian Reservation Boundary), Attachment... decision to approve the application does not approve, Tribal authority to implement any Clean Air Act...

  19. 40 CFR 463.20 - Applicability; description of the cleaning water subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Processes in the cleaning water subcategory are processes where water comes in contact with the plastic product for the purpose of cleaning the surface of the product and where water comes in contact with...

  20. 40 CFR 463.20 - Applicability; description of the cleaning water subcategory.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... Processes in the cleaning water subcategory are processes where water comes in contact with the plastic product for the purpose of cleaning the surface of the product and where water comes in contact with...

  1. 40 CFR 463.20 - Applicability; description of the cleaning water subcategory.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the cleaning water subcategory are processes where water comes in contact with the plastic product for the purpose of cleaning the surface of the product and where water comes in contact with shaping...

  2. 40 CFR 463.20 - Applicability; description of the cleaning water subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Processes in the cleaning water subcategory are processes where water comes in contact with the plastic product for the purpose of cleaning the surface of the product and where water comes in contact with...

  3. 40 CFR 144.31 - Application for a permit; authorization by permit.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Significant Deterioration (PSD) program under the Clean Air Act. (2) Name, mailing address, and location of... (PSD) program under the Clean Air Act. (v) Nonattainment program under the Clean Air Act. (vi) National...

  4. 40 CFR 144.31 - Application for a permit; authorization by permit.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Significant Deterioration (PSD) program under the Clean Air Act. (2) Name, mailing address, and location of... (PSD) program under the Clean Air Act. (v) Nonattainment program under the Clean Air Act. (vi) National...

  5. 40 CFR 144.31 - Application for a permit; authorization by permit.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Significant Deterioration (PSD) program under the Clean Air Act. (2) Name, mailing address, and location of... (PSD) program under the Clean Air Act. (v) Nonattainment program under the Clean Air Act. (vi) National...

  6. 40 CFR 144.31 - Application for a permit; authorization by permit.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Significant Deterioration (PSD) program under the Clean Air Act. (2) Name, mailing address, and location of... (PSD) program under the Clean Air Act. (v) Nonattainment program under the Clean Air Act. (vi) National...

  7. Cleaning Process Development for Metallic Additively Manufactured Parts

    NASA Technical Reports Server (NTRS)

    Tramel, Terri L.; Welker, Roger; Lowery, Niki; Mitchell, Mark

    2014-01-01

    Additive Manufacturing of metallic components for aerospace applications offers many advantages over traditional manufacturing techniques. As a new technology, many aspects of its widespread utilization remain open to investigation. Among these are the cleaning processes that can be used for post finishing of parts and measurements to verify effectiveness of the cleaning processes. Many cleaning and drying processes and measurement methods that have been used for parts manufactured using conventional techniques are candidates that may be considered for cleaning and verification of additively manufactured parts. Among these are vapor degreasing, ultrasonic immersion and spray cleaning, followed by hot air drying, vacuum baking and solvent displacement drying. Differences in porosity, density, and surface finish of additively manufactured versus conventionally manufactured parts may introduce new considerations in the selection of cleaning and drying processes or the method used to verify their effectiveness. This presentation will review the relative strengths and weaknesses of different candidate cleaning and drying processes as they may apply to additively manufactured metal parts for aerospace applications. An ultrasonic cleaning technique for exploring the cleanability of parts will be presented along with an example using additively manufactured Inconel 718 test specimens to illustrate its use. The data analysis shows that this ultrasonic cleaning approach results in a well-behaved ultrasonic cleaning/extraction behavior. That is, it does not show signs of accelerated cavitation erosion of the base material, which was later confirmed by neutron imaging. In addition, the analysis indicated that complete cleaning would be achieved by ultrasonic immersion cleaning at approximately 5 minutes, which was verified by subsequent cleaning of additional parts.

  8. Scalable splitting algorithms for big-data interferometric imaging in the SKA era

    NASA Astrophysics Data System (ADS)

    Onose, Alexandru; Carrillo, Rafael E.; Repetti, Audrey; McEwen, Jason D.; Thiran, Jean-Philippe; Pesquet, Jean-Christophe; Wiaux, Yves

    2016-11-01

    In the context of next-generation radio telescopes, like the Square Kilometre Array (SKA), the efficient processing of large-scale data sets is extremely important. Convex optimization tasks under the compressive sensing framework have recently emerged and provide both enhanced image reconstruction quality and scalability to increasingly larger data sets. We focus herein mainly on scalability and propose two new convex optimization algorithmic structures able to solve the convex optimization tasks arising in radio-interferometric imaging. They rely on proximal splitting and forward-backward iterations and can be seen, by analogy, with the CLEAN major-minor cycle, as running sophisticated CLEAN-like iterations in parallel in multiple data, prior, and image spaces. Both methods support any convex regularization function, in particular, the well-studied ℓ1 priors promoting image sparsity in an adequate domain. Tailored for big-data, they employ parallel and distributed computations to achieve scalability, in terms of memory and computational requirements. One of them also exploits randomization, over data blocks at each iteration, offering further flexibility. We present simulation results showing the feasibility of the proposed methods as well as their advantages compared to state-of-the-art algorithmic solvers. Our MATLAB code is available online on GitHub.
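
    A single forward-backward cycle of the kind the authors parallelize is a gradient step on the data-fidelity term followed by the proximal operator of the regularizer, which for the ℓ1 prior is soft-thresholding. A serial sketch for a generic linear measurement operator A (the paper's measurement operators, randomization, and parallel block structure are omitted):

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm (the 'backward' step)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, y, lam, iters=200):
    """Forward-backward iteration for min_x 0.5*||Ax - y||^2 + lam*||x||_1:
    gradient descent on the data term, then the l1 prox."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x
```

    The CLEAN analogy in the abstract maps the gradient step to the major cycle (confronting the data) and the prox step to the minor cycle (enforcing the sparsity prior).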

  9. Safety assessment of the use of Bacillus-based cleaning products.

    PubMed

    Berg, Ninna W; Evans, Matthew R; Sedivy, John; Testman, Robert; Acedo, Kimon; Paone, Domenic; Long, David; Osimitz, Thomas G

    2018-06-01

    Non-pathogenic Bacillus species used in cleaning products produce the appropriate enzymes to degrade stains and soils. However, there is little scientific data regarding the human exposure by inhalation of Bacillus spores during or after use of microbial-based cleaning products. Herein, air samples were collected at various locations in a ventilated, carpeted, residential room to determine the air concentration of viable bacteria and spores during and after the application of microbial-based carpet cleaning products containing Bacillus spores. The influence of human activities and vacuuming was investigated. Bioaerosol levels associated with use and post-application activities of whole room carpet treatments were elevated during post-application activity, but quickly returned to the indoor background range. Use of trigger spray spot applications generated aerosolized spores in the immediate vicinity, however, their use pattern and the generation of mostly non-respirable particles suggest minimal risks for pulmonary exposure from their use. The aerosol counts associated with use of these microbial-based cleaners were below the recommendation for safe exposure levels to non-pathogenic and non-toxigenic microorganisms except during application of the spot cleaner. The data presented suggest that carpet cleaning products, containing non-pathogenic Bacillus spores present a low potential for inhalation exposure and consequently minimal risk of adverse effects. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. A Toolbox to Improve Algorithms for Insulin-Dosing Decision Support

    PubMed Central

    Donsa, K.; Plank, J.; Schaupp, L.; Mader, J. K.; Truskaller, T.; Tschapeller, B.; Höll, B.; Spat, S.; Pieber, T. R.

    2014-01-01

    Background Standardized insulin order sets for subcutaneous basal-bolus insulin therapy are recommended by clinical guidelines for the inpatient management of diabetes. The algorithm based GlucoTab system electronically assists health care personnel by supporting clinical workflow and providing insulin-dose suggestions. Objective To develop a toolbox for improving clinical decision-support algorithms. Methods The toolbox has three main components. 1) Data preparation: Data from several heterogeneous sources is extracted, cleaned and stored in a uniform data format. 2) Simulation: The effects of algorithm modifications are estimated by simulating treatment workflows based on real data from clinical trials. 3) Analysis: Algorithm performance is measured, analyzed and simulated by using data from three clinical trials with a total of 166 patients. Results Use of the toolbox led to algorithm improvements as well as the detection of potential individualized subgroup-specific algorithms. Conclusion These results are a first step towards individualized algorithm modifications for specific patient subgroups. PMID:25024768

  11. RESOLVE: A new algorithm for aperture synthesis imaging of extended emission in radio astronomy

    NASA Astrophysics Data System (ADS)

    Junklewitz, H.; Bell, M. R.; Selig, M.; Enßlin, T. A.

    2016-02-01

    We present resolve, a new algorithm for radio aperture synthesis imaging of extended and diffuse emission in total intensity. The algorithm is derived using Bayesian statistical inference techniques, estimating the surface brightness in the sky assuming a priori log-normal statistics. resolve estimates the measured sky brightness in total intensity, and the spatial correlation structure in the sky, which is used to guide the algorithm to an optimal reconstruction of extended and diffuse sources. During this process, the algorithm succeeds in deconvolving the effects of the radio interferometric point spread function. Additionally, resolve provides a map with an uncertainty estimate of the reconstructed surface brightness. Furthermore, with resolve we introduce a new, optimal visibility weighting scheme that can be viewed as an extension to robust weighting. In tests using simulated observations, the algorithm shows improved performance against two standard imaging approaches for extended sources, Multiscale-CLEAN and the Maximum Entropy Method.

  12. SURVEY OF AIR AND GAS CLEANING OPERATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgenthaler, A.C.

    1959-09-01

    An informative summary of air and gas cleaning operations in the Chemical Processing Department of the Hanford Atomic Products Operation, Richland, Washington, is presented. Descriptions of the fundamental components of cleaning systems, their applications, and cost information are included. (R.G.G.)

  13. Assessment of the potential suitability of selected commercially available enzymes for cleaning-in-place (CIP) in the dairy industry.

    PubMed

    Boyce, Angela; Piterina, Anna V; Walsh, Gary

    2010-10-01

    The potential suitability of 10 commercial protease and lipase products for cleaning-in-place (CIP) application in the dairy industry was investigated on a laboratory scale. Assessment was based primarily on the ability of the enzymes to remove an experimentally generated milk fouling deposit from stainless steel (SS) panels. Three protease products were identified as being most suitable for this application on the basis of their cleaning performance at 40 °C, which was comparable to that of the commonly used cleaning agent, 1% NaOH at 60 °C. This was judged by quantification of residual organic matter and protein on the SS surface after cleaning and analysis by laser scanning confocal microscopy (LSCM). Enzyme activity was removed/inactivated under conditions simulating those normally undertaken after cleaning (rinsing with water, acid circulation, sanitation). Preliminary process-scale studies strongly suggest that enzyme-based CIP achieves satisfactory cleaning at an industrial scale. Cost analysis indicates that replacing caustic-based cleaning procedures with biodegradable enzymes operating at lower temperatures would be economically viable. Additional potential benefits include decreased energy and water consumption, improved safety, reduced waste generation, greater compatibility with wastewater treatment processes and a reduction in the environmental impact of the cleaning process.

  14. Supersonic gas-liquid cleaning system

    NASA Technical Reports Server (NTRS)

    Caimi, Raoul E. B.; Thaxton, Eric A.

    1994-01-01

    A system to perform cleaning and cleanliness verification is being developed to replace solvent flush methods using CFC 113 for fluid system components. The system is designed for two purposes: internal and external cleaning and verification. External cleaning is performed with the nozzle mounted at the end of a wand similar to a conventional pressure washer. Internal cleaning is performed with a variety of fixtures designed for specific applications. Internal cleaning includes tubes, pipes, flex hoses, and active fluid components such as valves and regulators. The system uses gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the object to be cleaned. Compressed air or any inert gas may be used to provide the conveying medium for the liquid. The converging-diverging nozzles accelerate the gas-liquid mixture to supersonic velocities. The liquid being accelerated may be any solvent including water. This system may be used commercially to replace CFC and other solvent cleaning methods widely used to remove dust, dirt, flux, and lubricants. In addition, cleanliness verification can be performed without the solvents which are typically involved. This paper will present the technical details of the system, the results achieved during testing at KSC, and future applications for this system.

  15. Supersonic gas-liquid cleaning system

    NASA Astrophysics Data System (ADS)

    Caimi, Raoul E. B.; Thaxton, Eric A.

    1994-02-01

    A system to perform cleaning and cleanliness verification is being developed to replace solvent flush methods using CFC 113 for fluid system components. The system is designed for two purposes: internal and external cleaning and verification. External cleaning is performed with the nozzle mounted at the end of a wand similar to a conventional pressure washer. Internal cleaning is performed with a variety of fixtures designed for specific applications. Internal cleaning includes tubes, pipes, flex hoses, and active fluid components such as valves and regulators. The system uses gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the object to be cleaned. Compressed air or any inert gas may be used to provide the conveying medium for the liquid. The converging-diverging nozzles accelerate the gas-liquid mixture to supersonic velocities. The liquid being accelerated may be any solvent including water. This system may be used commercially to replace CFC and other solvent cleaning methods widely used to remove dust, dirt, flux, and lubricants. In addition, cleanliness verification can be performed without the solvents which are typically involved. This paper will present the technical details of the system, the results achieved during testing at KSC, and future applications for this system.

  16. Data Mining.

    ERIC Educational Resources Information Center

    Benoit, Gerald

    2002-01-01

    Discusses data mining (DM) and knowledge discovery in databases (KDD), taking the view that KDD is the larger view of the entire process, with DM emphasizing the cleaning, warehousing, mining, and visualization of knowledge discovery in databases. Highlights include algorithms; users; the Internet; text mining; and information extraction.…

  17. Evaluation of Surface Sampling for Bacillus Spores Using ...

    EPA Pesticide Factsheets

    Journal Article In this study, commercially available domestic cleaning robots were evaluated for spore surface sampling efficiency on common indoor surfaces. The current study determined the sampling efficiency of each robot without modifying the sensors, algorithms, or logic set by the manufacturers.

  18. Study and development of 22 kW peak power fiber coupled short pulse Nd:YAG laser for cleaning applications

    NASA Astrophysics Data System (ADS)

    Choubey, Ambar; Vishwakarma, S. C.; Vachhani, D. M.; Singh, Ravindra; Misra, Pushkar; Jain, R. K.; Arya, R.; Upadhyaya, B. N.; Oak, S. M.

    2014-11-01

    Free running short pulse Nd:YAG lasers of microsecond pulse duration and high peak power have a unique capability to ablate material from a surface without heat propagation into the bulk. Applications of short pulse Nd:YAG lasers include cleaning and restoration of marble, stones, and a variety of metals for conservation. A study on the development of high peak power short pulses from an Nd:YAG laser, along with its cleaning and conservation applications, has been performed. A pulse energy of 1.25 J with 55 μs pulse duration and a maximum peak power of 22 kW has been achieved. The laser beam has an M2 value of ~28 and a pulse-to-pulse stability of ±2.5%; a lower M2 value means better beam quality in multimode operation. A top hat spatial profile of the laser beam was achieved at the exit end of a 200 μm core diameter optical fiber, which is desirable for uniform cleaning. This laser system has been evaluated for efficient cleaning of surface contamination on marble, zircaloy, and inconel materials for conservation, with cleaning efficiency as high as 98%. The laser's cleaning quality and efficiency have been analysed using optical microscopy, scanning electron microscopy (SEM), and X-ray photoelectron spectroscopy (XPS) measurements.

  19. A pruning algorithm for Meta-blocking based on cumulative weight

    NASA Astrophysics Data System (ADS)

    Zhang, Fulin; Gao, Zhipeng; Niu, Kun

    2017-08-01

    Entity Resolution is an important process in data cleaning and data integration. It usually employs a blocking method to avoid quadratic-complexity work when scaling to large data sets. Meta-blocking performs better in the context of highly heterogeneous information spaces, yet its precision and efficiency still have room to improve. In this paper, we present a new pruning algorithm for Meta-blocking. It achieves higher precision than the existing WEP algorithm at a small cost in recall and, in addition, reduces the runtime of the blocking process. We evaluate our proposed method over five real-world data sets.
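A minimal sketch of weight-based edge pruning in a Meta-blocking graph, where each edge weight estimates how likely two entity profiles are to match. The cumulative-weight retention rule below is an assumption for illustration; the paper's exact pruning criterion may differ:

```python
# Illustrative pruning of a Meta-blocking graph: keep the heaviest edges
# until their cumulative weight reaches a fraction of the total weight.
def prune_edges(edges, keep_fraction=0.8):
    """edges: list of ((id1, id2), weight) pairs from the blocking graph.
    Returns the retained edges, heaviest first."""
    total = sum(w for _, w in edges)
    kept, cum = [], 0.0
    for pair, w in sorted(edges, key=lambda e: e[1], reverse=True):
        if cum >= keep_fraction * total:
            break  # remaining light edges are pruned
        kept.append((pair, w))
        cum += w
    return kept

edges = [(("a", "b"), 5.0), (("a", "c"), 3.0), (("b", "c"), 1.0), (("c", "d"), 1.0)]
print(prune_edges(edges))  # the two heaviest edges survive
```

Pruning light edges trades a little recall (some true matches are dropped) for precision and runtime, which is the trade-off the abstract describes.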

  20. 40 CFR 63.803 - Work practice standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... containers for storing finishing, gluing, cleaning, and washoff materials. (h) Application equipment... solvent used for line cleaning into a normally closed container. (j) Gun cleaning. Each owner or operator... closed container. (k) Washoff operations. Each owner or operator of an affected source shall control...

  1. 40 CFR 63.803 - Work practice standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... containers for storing finishing, gluing, cleaning, and washoff materials. (h) Application equipment... solvent used for line cleaning into a normally closed container. (j) Gun cleaning. Each owner or operator... closed container. (k) Washoff operations. Each owner or operator of an affected source shall control...

  2. 40 CFR 63.803 - Work practice standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... containers for storing finishing, gluing, cleaning, and washoff materials. (h) Application equipment... solvent used for line cleaning into a normally closed container. (j) Gun cleaning. Each owner or operator... closed container. (k) Washoff operations. Each owner or operator of an affected source shall control...

  3. Preliminary test results of a flight management algorithm for fuel conservative descents in a time based metered traffic environment. [flight tests of an algorithm to minimize fuel consumption of aircraft based on flight time

    NASA Technical Reports Server (NTRS)

    Knox, C. E.; Cannon, D. G.

    1979-01-01

    A flight management algorithm designed to improve the accuracy of delivering the airplane fuel-efficiently to a metering fix at a time designated by air traffic control is discussed. The algorithm provides a 3-D path with time control (4-D) for a test B 737 airplane to make an idle-thrust, clean-configured descent to arrive at the metering fix at a predetermined time, altitude, and airspeed. The descent path is calculated for a constant Mach/airspeed schedule from linear approximations of airplane performance with considerations given for gross weight, wind, and nonstandard pressure and temperature effects. The flight management descent algorithm and the results of the flight tests are discussed.

  4. Design and realization of disaster assessment algorithm after forest fire

    NASA Astrophysics Data System (ADS)

    Xu, Aijun; Wang, Danfeng; Tang, Lihua

    2008-10-01

    Based on GIS technology, this paper focuses on the application of a disaster assessment algorithm for use after forest fires and on the design and realization of GIS-based disaster assessment. Through the analysis and processing of multi-source, heterogeneous data, the paper integrates the foundation laid by domestic and foreign research on forest fire loss assessment with related knowledge of assessment, accounting, and forest resource appraisal, in order to develop the theoretical framework and assessment indices for forest fire loss assessment. Boundary extraction, overlay analysis, and partitioning of multi-source spatial data are used to implement the investigation of the burnt forest area and the computation of the fire area. The assessment provides evidence for fire cleaning in burnt areas and for new restoration policies, in terms of the direct and indirect economic losses and the ecological and environmental damage caused by forest fire under different fire danger classes and different amounts of forest accumulation, thus making forest resource protection faster, more efficient, and more economical. Finally, the paper takes Lin'an city of Zhejiang province as a test area to validate the key technologies of the proposed method.

  5. Application of concepts from cross-recurrence analysis in speech production: an overview and comparison with other nonlinear methods.

    PubMed

    Lancia, Leonardo; Fuchs, Susanne; Tiede, Mark

    2014-06-01

    The aim of this article was to introduce an important tool, cross-recurrence analysis, to speech production applications by showing how it can be adapted to evaluate the similarity of multivariate patterns of articulatory motion. The method differs from classical applications of cross-recurrence analysis because no phase space reconstruction is conducted, and a cleaning algorithm removes the artifacts from the recurrence plot. The main features of the proposed approach are robustness to nonstationarity and efficient separation of amplitude variability from temporal variability. The authors tested these claims by applying their method to synthetic stimuli whose variability had been carefully controlled. The proposed method was also demonstrated in a practical application: It was used to investigate the role of biomechanical constraints in articulatory reorganization as a consequence of speeded repetition of CVCV utterances containing a labial and a coronal consonant. Overall, the proposed approach provided more reliable results than other methods, particularly in the presence of high variability. The proposed method is a useful and appropriate tool for quantifying similarity and dissimilarity in patterns of speech articulator movement, especially in such research areas as speech errors and pathologies, where unpredictable divergent behavior is expected.
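The core object of this analysis, a cross-recurrence plot, can be sketched for the one-dimensional case as follows. The absolute-distance criterion and radius are simplifying assumptions; the paper works with multivariate articulatory trajectories and adds an artifact-cleaning step not shown here:

```python
# Minimal cross-recurrence plot between two 1-D trajectories: entry (i, j)
# is 1 wherever x[i] and y[j] fall within a fixed radius of each other.
import numpy as np

def cross_recurrence(x, y, radius):
    """Return the binary cross-recurrence matrix of shape (len(x), len(y))."""
    x = np.asarray(x, dtype=float)[:, None]  # column vector
    y = np.asarray(y, dtype=float)[None, :]  # row vector
    return (np.abs(x - y) <= radius).astype(int)
```

Diagonal structures in such a matrix indicate stretches where the two trajectories follow similar courses, which is the basis for the similarity measures the article adapts to articulatory motion.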

  6. Ultra-high heat flux cooling characteristics of cryogenic micro-solid nitrogen particles and its application to semiconductor wafer cleaning technology

    NASA Astrophysics Data System (ADS)

    Ishimoto, Jun; Oh, U.; Guanghan, Zhao; Koike, Tomoki; Ochiai, Naoya

    2014-01-01

    The ultra-high heat flux cooling characteristics and impingement behavior of cryogenic micro-solid nitrogen (SN2) particles in relation to a heated wafer substrate were investigated for application to next generation semiconductor wafer cleaning technology. The fundamental characteristics of cooling heat transfer and photoresist removal-cleaning performance using micro-solid nitrogen particulate spray impinging on a heated substrate were numerically investigated and experimentally measured by a new type of integrated computational-experimental technique. This study contributes not only advanced cryogenic cooling technology for high thermal emission devices, but also to the field of nano device engineering including the semiconductor wafer cleaning technology.

  7. Novel approaches to assess the quality of fertility data stored in dairy herd management software.

    PubMed

    Hermans, K; Waegeman, W; Opsomer, G; Van Ranst, B; De Koster, J; Van Eetvelde, M; Hostens, M

    2017-05-01

    Scientific journals and popular press magazines are littered with articles in which the authors use data from dairy herd management software. Almost none of these papers include data cleaning and data quality assessment in their study design, despite this being a very critical step during data mining. This paper presents 2 novel data cleaning methods that permit identification of animals with good and bad data quality. The first method is a deterministic, rule-based data cleaning method. Reproduction and mutation (life-changing) events such as birth and death were converted to a symbolic (alphabetical letter) representation and split into triplets (3-letter codes). The triplets were manually labeled as physiologically correct, suspicious, or impossible. The deterministic data cleaning method was applied to assess the quality of data stored in dairy herd management software from 26 farms enrolled in the herd health management program of the Faculty of Veterinary Medicine, Ghent University, Belgium. In total, 150,443 triplets were created; 65.4% were labeled as correct, 17.4% as suspicious, and 17.2% as impossible. The second method, a probabilistic method, uses a machine learning algorithm (random forests) to predict the correctness of fertility and mutation events in an early stage of data cleaning. The prediction accuracy of the random forests algorithm was compared with a classical linear statistical method (penalized logistic regression), outperforming the latter substantially with a superior receiver operating characteristic curve and a higher accuracy (89 vs. 72%). From these results, we conclude that the triplet method can be used to assess the quality of reproduction data stored in dairy herd management software and that a machine learning technique such as random forests is capable of predicting the correctness of fertility data. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
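The triplet method described above can be sketched as follows. The event letters and the labeled triplet sets are illustrative stand-ins, not the actual codes or rule table from the paper:

```python
# Hypothetical sketch of the rule-based triplet check: life events become
# letters, 3-letter windows are looked up in manually labeled sets.
EVENTS = {"birth": "B", "insemination": "I", "calving": "C", "death": "D"}

# Illustrative rule table (assumed, not from the paper): which triplets
# are physiologically plausible or impossible.
CORRECT = {"BIC", "ICI"}
IMPOSSIBLE = {"BBB", "DBI"}

def to_triplets(events):
    """Convert an ordered event list to overlapping 3-letter codes."""
    letters = "".join(EVENTS[e] for e in events)
    return [letters[i:i + 3] for i in range(len(letters) - 2)]

def label(triplet):
    """Classify a triplet as correct, impossible, or suspicious."""
    if triplet in CORRECT:
        return "correct"
    if triplet in IMPOSSIBLE:
        return "impossible"
    return "suspicious"

history = ["birth", "insemination", "calving", "insemination"]
print([(t, label(t)) for t in to_triplets(history)])
```

Animals whose histories contain impossible triplets can then be excluded or reviewed before the data are used for analysis.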

  8. A masked least-squares smoothing procedure for artifact reduction in scanning-EMG recordings.

    PubMed

    Corera, Íñigo; Eciolaza, Adrián; Rubio, Oliver; Malanda, Armando; Rodríguez-Falces, Javier; Navallas, Javier

    2018-01-11

    Scanning-EMG is an electrophysiological technique in which the electrical activity of the motor unit is recorded at multiple points along a corridor crossing the motor unit territory. Correct analysis of the scanning-EMG signal requires prior elimination of interference from nearby motor units. Although the traditional processing based on median filtering is effective in removing such interference, it distorts the physiological waveform of the scanning-EMG signal. In this study, we describe a new scanning-EMG signal processing algorithm that preserves the physiological signal waveform while effectively removing interference from other motor units. To obtain a cleaned-up version of the scanning signal, the masked least-squares smoothing (MLSS) algorithm recalculates and replaces each sample value of the signal using a least-squares smoothing in the spatial dimension, taking into account the information of only those samples that are not contaminated with activity of other motor units. The performance of the new algorithm with simulated scanning-EMG signals is studied and compared with the performance of the median algorithm and tested with real scanning signals. Results show that the MLSS algorithm distorts the waveform of the scanning-EMG signal much less than the median algorithm (approximately 3.5 dB gain), being at the same time very effective at removing interference components. Graphical Abstract: The raw scanning-EMG signal (left figure) is processed by the MLSS algorithm to remove the artifact interference. First, artifacts are detected in the raw signal, yielding a validity mask (central figure) that marks which samples have been contaminated by artifacts. Second, a least-squares smoothing procedure in the spatial dimension is applied to the raw signal using the uncontaminated samples according to the validity mask. The resulting MLSS-processed scanning-EMG signal (right figure) is free of artifact interference.
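A one-dimensional sketch of the masked least-squares idea: each sample is replaced by a local least-squares polynomial fit computed only from the samples a validity mask marks as uncontaminated. The window size and polynomial order are assumptions for illustration; the published MLSS smooths along the spatial dimension of the scanning corridor with its own choices:

```python
# Toy masked least-squares smoothing on a 1-D signal: contaminated samples
# (valid == False) are excluded from every local fit, so an artifact cannot
# bias the reconstruction of its own neighborhood.
import numpy as np

def mlss_1d(signal, valid, half_window=3, order=2):
    """Replace each sample by a local polynomial least-squares fit that
    uses only the valid samples inside the window around it."""
    out = np.empty(len(signal), dtype=float)
    n = len(signal)
    for i in range(n):
        lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
        idx = np.arange(lo, hi)
        idx = idx[valid[lo:hi]]                       # drop contaminated samples
        deg = min(order, len(idx) - 1)                # guard small windows
        coeffs = np.polyfit(idx, np.asarray(signal, dtype=float)[idx], deg)
        out[i] = np.polyval(coeffs, i)                # evaluate the fit at i
    return out
```

On a clean ramp with one masked spike, the fit from the spike's uncontaminated neighbors reconstructs the underlying value at the spike's position.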

  9. 14 CFR 1260.34 - Clean air and water.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 5 2011-01-01 2010-01-01 true Clean air and water. 1260.34 Section 1260.34... Provisions § 1260.34 Clean air and water. Clean Air and Water October 2000 (Applicable only if the award... (42 U.S.C. 1857c-8(c)(1) or the Federal Water Pollution Control Act (33 U.S.C. 1319(c)), and is listed...

  10. Research on Operation Strategy for Bundled Wind-thermal Generation Power Systems Based on Two-Stage Optimization Model

    NASA Astrophysics Data System (ADS)

    Sun, Congcong; Wang, Zhijie; Liu, Sanming; Jiang, Xiuchen; Sheng, Gehao; Liu, Tianyu

    2017-05-01

    Wind power has the advantages of being clean and non-polluting, and the development of bundled wind-thermal generation power systems (BWTGSs) is one of the important means to improve the wind power accommodation rate and implement a “clean alternative” on the generation side. A two-stage optimization strategy for BWTGSs considering wind speed forecasting results and load characteristics is proposed. By taking short-term wind speed forecasting results on the generation side and load characteristics on the demand side into account, a two-stage optimization model for BWTGSs is formulated. Using the environmental benefit index of BWTGSs as the objective function, with supply-demand balance and generator operation as the constraints, the first-stage optimization model is developed with chance-constrained programming theory. Using the operation cost of BWTGSs as the objective function, the second-stage optimization model is developed with a greedy algorithm. An improved PSO algorithm is employed to solve the model, and numerical tests verify the effectiveness of the proposed strategy.

  11. 75 FR 51239 - Application(s) for Duty-Free Entry of Scientific Instruments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-19

    ... electrically conducting as well as insulating nanostructures prepared by in situ deposition onto clean surfaces. In-situ capabilities allow the preparation of clean and well-defined nanostructures on pristine... these experiments. The instrument also has in-situ preparation capability and the ability to operate in...

  12. 40 CFR 63.460 - Applicability and designation of source.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... cold, and batch cold solvent cleaning machine that uses any solvent containing methylene chloride (CAS... combination of these halogenated HAP solvents, in a total concentration greater than 5 percent by weight, as a... to owners or operators of any solvent cleaning machine meeting the applicability criteria of...

  13. 40 CFR 63.460 - Applicability and designation of source.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... cold, and batch cold solvent cleaning machine that uses any solvent containing methylene chloride (CAS... combination of these halogenated HAP solvents, in a total concentration greater than 5 percent by weight, as a... to owners or operators of any solvent cleaning machine meeting the applicability criteria of...

  14. 40 CFR 63.460 - Applicability and designation of source.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... cold, and batch cold solvent cleaning machine that uses any solvent containing methylene chloride (CAS... combination of these halogenated HAP solvents, in a total concentration greater than 5 percent by weight, as a... to owners or operators of any solvent cleaning machine meeting the applicability criteria of...

  15. Fogging technique used to coat magnesium with plastic

    NASA Technical Reports Server (NTRS)

    Mroz, T. S.

    1967-01-01

    Cleaning process and a fogging technique facilitate the application of a plastic coating to magnesium plates. The cleaning process removes general organic and inorganic surface impurities, oils and greases, and oxides and carbonates from the magnesium surfaces. The fogging technique produces a thin-filmlike coating in a clean room atmosphere.

  16. 40 CFR 35.3000 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... and purposes of the Clean Water Act and applicable regulations. (c) Where a Tribe has previously qualified for treatment as a State under a Clean Water Act or Safe Drinking Water Act program, the Tribe... of the wastewater treatment works construction grants program under section 205(g) of the Clean Water...

  17. 40 CFR 35.3000 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... and purposes of the Clean Water Act and applicable regulations. (c) Where a Tribe has previously qualified for treatment as a State under a Clean Water Act or Safe Drinking Water Act program, the Tribe... of the wastewater treatment works construction grants program under section 205(g) of the Clean Water...

  18. 40 CFR 35.3000 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and purposes of the Clean Water Act and applicable regulations. (c) Where a Tribe has previously qualified for treatment as a State under a Clean Water Act or Safe Drinking Water Act program, the Tribe... of the wastewater treatment works construction grants program under section 205(g) of the Clean Water...

  19. 40 CFR 35.3000 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... and purposes of the Clean Water Act and applicable regulations. (c) Where a Tribe has previously qualified for treatment as a State under a Clean Water Act or Safe Drinking Water Act program, the Tribe... of the wastewater treatment works construction grants program under section 205(g) of the Clean Water...

  20. 40 CFR 35.3000 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... and purposes of the Clean Water Act and applicable regulations. (c) Where a Tribe has previously qualified for treatment as a State under a Clean Water Act or Safe Drinking Water Act program, the Tribe... of the wastewater treatment works construction grants program under section 205(g) of the Clean Water...

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandor, Debra; Chung, Donald; Keyser, David

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  2. Cleaning by clustering: methodology for addressing data quality issues in biomedical metadata.

    PubMed

    Hu, Wei; Zaveri, Amrapali; Qiu, Honglei; Dumontier, Michel

    2017-09-18

    The ability to efficiently search and filter datasets depends on access to high quality metadata. While most biomedical repositories require data submitters to provide a minimal set of metadata, some, such as the Gene Expression Omnibus (GEO), allow users to specify additional metadata in the form of textual key-value pairs (e.g. sex: female). However, since there is no structured vocabulary to guide the submitter regarding the metadata terms to use, the 44,000,000+ key-value pairs in GEO suffer from numerous quality issues including redundancy, heterogeneity, inconsistency, and incompleteness. Such issues hinder the ability of scientists to hone in on datasets that meet their requirements and point to a need for accurate, structured and complete description of the data. In this study, we propose a clustering-based approach to address data quality issues in biomedical, specifically gene expression, metadata. First, we present three different kinds of similarity measures to compare metadata keys. Second, we design a scalable agglomerative clustering algorithm to cluster similar keys together. Our agglomerative clustering algorithm grouped together metadata keys that were similar to each other based on (i) name, (ii) core concept and (iii) value similarities. We evaluated our method using a manually created gold standard in which 359 keys were grouped into 27 clusters based on six types of characteristics: (i) age, (ii) cell line, (iii) disease, (iv) strain, (v) tissue and (vi) treatment. The algorithm generated 18 clusters containing 355 keys (four clusters with only one key were excluded). In the 18 clusters, most keys were correctly assigned to a related cluster, but 13 keys were not related to their cluster. We compared our approach with four other published methods. Our approach significantly outperformed them for most metadata keys and achieved the best average F-score (0.63). The intuition underpinning cleaning by clustering is that dividing keys into clusters resolves the scalability issues for data observation and cleaning, and duplicates and errors among keys in the same cluster can easily be found. Our algorithm can also be applied to other biomedical data types.
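A toy version of grouping metadata keys by similarity can be sketched with a greedy single-linkage pass, using standard-library string similarity as a stand-in for the paper's three similarity measures (name, core concept, value):

```python
# Illustrative clustering of metadata keys by name similarity only; the
# threshold and greedy single-linkage rule are assumptions for demonstration,
# not the paper's scalable agglomerative algorithm.
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def cluster_keys(keys, threshold=0.7):
    """A key joins the first cluster containing a sufficiently similar
    member; otherwise it starts a new cluster."""
    clusters = []
    for key in keys:
        for cluster in clusters:
            if any(similarity(key, member) >= threshold for member in cluster):
                cluster.append(key)
                break
        else:
            clusters.append([key])
    return clusters

print(cluster_keys(["age", "AGE", "cell line", "cell_line", "tissue"]))
```

Once near-duplicate keys land in the same cluster, redundancy and inconsistency (e.g. `cell line` vs. `cell_line`) become easy to spot and repair, which is the "cleaning by clustering" intuition described above.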

  3. Two Procedures to Flag Radio Frequency Interference in the UV Plane

    NASA Astrophysics Data System (ADS)

    Sekhar, Srikrishna; Athreya, Ramana

    2018-07-01

    We present two algorithms to identify and flag radio frequency interference (RFI) in radio interferometric imaging data. The first algorithm utilizes the redundancy of visibilities inside a UV cell in the visibility plane to identify corrupted data, while varying the detection threshold in accordance with the observed reduction in noise with radial UV distance. In the second algorithm, we propose a scheme to detect faint RFI in the visibility time-channel (TC) plane of baselines. The efficacy of identifying RFI in the residual visibilities is reduced by the presence of ripples due to inaccurate subtraction of the strongest sources. This can be due to several reasons including primary beam asymmetries and other direction-dependent calibration errors. We eliminated these ripples by clipping the corresponding peaks in the associated Fourier plane. RFI was detected in the ripple-free TC plane but was flagged in the original visibilities. Application of these two algorithms to five different 150 MHz data sets from the GMRT resulted in a reduction in image noise of 20%–50% throughout the field along with a reduction in systematics and a corresponding increase in the number of detected sources. However, in comparing the mean flux densities before and after flagging RFI, we find a differential change with the fainter sources (25σ < S < 100 mJy) showing a change of ‑6% to +1% relative to the stronger sources (S > 100 mJy). We are unable to explain this effect, but it could be related to the CLEAN bias known for interferometers.
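The first algorithm's premise, that visibilities falling in the same UV cell are redundant and so a strong outlier among them is likely RFI, can be sketched with a robust outlier test. The fixed threshold here ignores the radial scaling of the detection threshold described in the paper:

```python
# Sketch of per-UV-cell RFI flagging: compare each visibility amplitude in
# a cell against robust statistics of that cell, so strong RFI does not
# bias the estimate it is tested against.
import numpy as np

def flag_uv_cell(amplitudes, n_sigma=5.0):
    """Return a boolean mask (True = flagged) for one UV cell."""
    amps = np.asarray(amplitudes, dtype=float)
    med = np.median(amps)
    mad = np.median(np.abs(amps - med))
    sigma = 1.4826 * mad  # MAD -> Gaussian-equivalent standard deviation
    return np.abs(amps - med) > n_sigma * sigma
```

Using the median and MAD instead of the mean and standard deviation matters here: a single strong RFI spike would otherwise inflate the scatter estimate enough to hide itself.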

  4. An analysis dictionary learning algorithm under a noisy data model with orthogonality constraint.

    PubMed

    Zhang, Ye; Yu, Tenglong; Wang, Wenwu

    2014-01-01

    Two common problems are often encountered in analysis dictionary learning (ADL) algorithms. The first one is that the original clean signals for learning the dictionary are assumed to be known, which otherwise need to be estimated from noisy measurements. This, however, renders a computationally slow optimization process and potentially unreliable estimation (if the noise level is high), as represented by the Analysis K-SVD (AK-SVD) algorithm. The other problem is the trivial solution to the dictionary, for example, the null dictionary matrix that may be given by a dictionary learning algorithm, as discussed in the learning overcomplete sparsifying transform (LOST) algorithm. Here we propose a novel optimization model and an iterative algorithm to learn the analysis dictionary, where we directly employ the observed data to compute the approximate analysis sparse representation of the original signals (leading to a fast optimization procedure) and enforce an orthogonality constraint on the optimization criterion to avoid the trivial solutions. Experiments demonstrate the competitive performance of the proposed algorithm as compared with three baselines, namely, the AK-SVD, LOST, and NAAOLA algorithms.

  5. Geostatistical applications in environmental remediation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, R.N.; Purucker, S.T.; Lyon, B.F.

    1995-02-01

    Geostatistical analysis refers to a collection of statistical methods for addressing data that vary in space. By incorporating spatial information into the analysis, geostatistics has advantages over traditional statistical analysis for problems with a spatial context. Geostatistics has a history of success in earth science applications, and its popularity is increasing in other areas, including environmental remediation. Due to recent advances in computer technology, geostatistical algorithms can be executed at a speed comparable to many standard statistical software packages. When used responsibly, geostatistics is a systematic and defensible tool that can be used in various decision frameworks, such as the Data Quality Objectives (DQO) process. At every point in the site, geostatistics can estimate both the concentration level and the probability or risk of exceeding a given value. These probability maps can assist in identifying clean-up zones. Given any decision threshold and an acceptable level of risk, the probability maps identify those areas that are estimated to be above or below the acceptable risk. Those areas that are above the threshold are of the most concern with regard to remediation. In addition to estimating clean-up zones, geostatistics can assist in designing cost-effective secondary sampling schemes: areas of the probability map with high levels of estimated uncertainty are areas where more secondary sampling should occur. In addition, geostatistics has the ability to incorporate soft data directly into the analysis, including historical records, a highly correlated secondary contaminant, or expert judgment. In environmental remediation, geostatistics is a tool that, in conjunction with other methods, can provide a common forum for building consensus.
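The probability-map idea can be sketched as follows: given a stack of simulated concentration fields (e.g. from conditional simulation), the per-cell exceedance probability is the fraction of realizations above the action threshold, and cells whose probability exceeds the acceptable risk become remediation candidates. The array shapes and threshold values are illustrative:

```python
# Sketch of turning simulated concentration fields into a probability map
# and a clean-up zone; the simulation step itself (e.g. kriging-based
# conditional simulation) is assumed to have happened upstream.
import numpy as np

def exceedance_probability(realizations, threshold):
    """realizations: (n_sim, ny, nx) stack of simulated concentration
    fields. Returns the per-cell probability of exceeding the threshold."""
    return (np.asarray(realizations) > threshold).mean(axis=0)

def cleanup_zone(prob_map, acceptable_risk):
    """Cells whose exceedance probability is above the acceptable risk
    are flagged as remediation candidates."""
    return prob_map > acceptable_risk
```

The same probability map also guides secondary sampling: cells with probabilities near the decision boundary (high uncertainty) are where additional samples buy the most information.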

  6. Gas cleaning system and method

    DOEpatents

    Newby, Richard Allen

    2006-06-06

    A gas cleaning system for removing at least a portion of contaminants, such as halides, sulfur, particulates, mercury, and others, from a synthesis gas (syngas). The gas cleaning system may include one or more filter vessels coupled in series for removing halides, particulates, and sulfur from the syngas. The gas cleaning system may be operated by receiving gas at a first temperature and pressure and dropping the temperature of the syngas as the gas flows through the system. The gas cleaning system may be used for an application requiring clean syngas, such as, but not limited to, fuel cell power generation, IGCC power generation, and chemical synthesis.

  7. 77 FR 43860 - Notice of Lodging of Consent Decree Pursuant to the Clean Water Act

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-26

    ... water, and applicable oil pollution prevention regulations, at Fairhaven Shipyard's two facilities at 50... DEPARTMENT OF JUSTICE Notice of Lodging of Consent Decree Pursuant to the Clean Water Act In... Companies, Inc. (``Fairhaven Shipyard'') violated Sections 301, 311, and 402 of the Clean Water Act, 33 U.S...

  8. Clean room technology in surgery suites

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The principles of clean room technology and the criteria for their application to surgery are discussed. The basic types of surgical clean rooms are presented along with their advantages and disadvantages. Topics discussed include: microbiology of surgery suites; principles of laminar airflow systems, and their use in surgery; and asepsis and the operating room.

  9. Membrane cleaning with ultrasonically driven bubbles.

    PubMed

    Reuter, Fabian; Lauterborn, Sonja; Mettin, Robert; Lauterborn, Werner

    2017-07-01

    A laboratory filtration plant for drinking water treatment is constructed to study the conditions for purely mechanical in situ cleaning of fouled polymeric membranes by the application of ultrasound. The filtration is done by suction of water with defined constant contamination through a membrane module, a stack of five pairs of flat-sheet ultrafiltration membranes. The short cleaning cycle to remove the cake layer from the membranes includes backwashing, the application of ultrasound, and air flushing. A special geometry for sound irradiation of the membranes parallel to their surfaces is chosen. Two frequencies, 35 kHz and 130 kHz, and different driving powers are tested for their cleaning effectiveness. No cleaning is found for 35 kHz, whereas good cleaning results are obtained for 130 kHz, with an optimum cleaning effectiveness at moderate driving powers. Acoustic and optic measurements in space and time, as well as analytical considerations and numerical calculations, reveal the reasons and confirm the experimental results. The sound field is measured in high resolution, and bubble structures are imaged at high speed at their nucleation sites as well as during their cleaning work at the membrane surface. Microscopic inspection of the membrane surface after cleaning shows distinct cleaning types in the cake layer that are related to specific bubble behaviour on the membrane. The membrane integrity and permeate quality are checked on-line by particle counting and turbidity measurement of the permeate. No signs of membrane damage or irreversible degradation in permeability are detected, and an excellent permeate quality is retained.

  10. Alternative Fuels Data Center

    Science.gov Websites

    Clean Diesel Grant: The Rhode Island Clean Diesel Fund provides companies with reimbursement grants. For more information, including eligibility and application requirements, see the Rhode...

  11. Methods/Labor Standards Application Program - Phase IV

    DTIC Science & Technology

    1985-01-01

    Gantry maintenance checklist excerpts (annual): S2. Engine Platform: a. Pressure switch; b. Compressor motor; c. Voltage regulator; d. Open and clean generator exciter and main windings. S3. Main Collector... Travel Motors: a. Open and clean motors; b. Slip rings. The same engine platform and travel motor items repeat for Gantry #3 (S2-S3) and Gantry #4 (S2-S5).

  12. Comparative intelligibility investigation of single-channel noise-reduction algorithms for Chinese, Japanese, and English.

    PubMed

    Li, Junfeng; Yang, Lin; Zhang, Jianping; Yan, Yonghong; Hu, Yi; Akagi, Masato; Loizou, Philipos C

    2011-05-01

    A large number of single-channel noise-reduction algorithms have been proposed, based largely on mathematical principles. Most of these algorithms, however, have been evaluated with English speech. Given the different perceptual cues used by native listeners of different languages, including tonal languages, it is of interest to examine whether there are any language effects when the same noise-reduction algorithm is used to process noisy speech in different languages. This study presents a comparative evaluation of various single-channel noise-reduction algorithms applied to noisy speech in three languages: Chinese, Japanese, and English. Clean speech signals (Chinese words and Japanese words) were first corrupted by three types of noise at two signal-to-noise ratios and then processed by five single-channel noise-reduction algorithms. The processed signals were finally presented to normal-hearing listeners for recognition. Intelligibility evaluation showed that the majority of noise-reduction algorithms did not improve speech intelligibility. Consistent with a previous study with the English language, the Wiener filtering algorithm produced small, but statistically significant, improvements in intelligibility for car and white noise conditions. Significant differences between the performances of noise-reduction algorithms across the three languages were observed.
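
As a rough illustration of the Wiener filtering family evaluated here (a generic sketch, not the exact algorithm from the study), a per-frequency-bin Wiener gain can be applied to the short-time spectrum of a noisy frame. The noise power estimate and the gain floor below are assumptions.

```python
import numpy as np

def wiener_gain(noisy_power, noise_power, floor=0.05):
    """Per-frequency-bin Wiener gain G = SNR / (1 + SNR), with the SNR
    estimated by power subtraction; `floor` limits musical-noise artifacts."""
    snr = np.maximum(noisy_power - noise_power, 0.0) / np.maximum(noise_power, 1e-12)
    return np.maximum(snr / (1.0 + snr), floor)

def denoise_frame(frame, noise_power):
    """Apply the gain to one windowed frame in the short-time spectral domain."""
    spec = np.fft.rfft(frame)
    gain = wiener_gain(np.abs(spec) ** 2, noise_power)
    return np.fft.irfft(gain * spec, n=len(frame))
```

In a full system this would run frame by frame with overlap-add, with the noise power tracked during speech pauses.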

  13. 40 CFR 88.305-94 - Clean-fuel fleet vehicle labeling requirements for heavy-duty vehicles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... requirements for heavy-duty vehicles. 88.305-94 Section 88.305-94 Protection of Environment ENVIRONMENTAL...-94 Clean-fuel fleet vehicle labeling requirements for heavy-duty vehicles. (a) All clean-fuel heavy... LEV, ULEV, or ZEV, and meets all of the applicable requirements of this part 88. (b) All heavy-duty...

  14. Incrementing data quality of multi-frequency echograms using the Adaptive Wiener Filter (AWF) denoising algorithm

    NASA Astrophysics Data System (ADS)

    Peña, M.

    2016-10-01

    Achieving an acceptable signal-to-noise ratio (SNR) can be difficult when working in sparsely populated waters and/or when species have low scattering, such as fluid-filled animals. The increasing use of higher frequencies and the study of deeper depths in fisheries acoustics, as well as the use of commercial vessels, are raising the need for good denoising algorithms. The use of a lower Sv threshold to remove noise or unwanted targets is not suitable in many cases and increases the relative background noise component in the echogram, demanding more effectiveness from denoising algorithms. The Adaptive Wiener Filter (AWF) denoising algorithm is presented in this study. The technique is based on the AWF commonly used in digital photography and video enhancement. The algorithm first improves the quality of the data with a variance-dependent smoothing, before estimating the noise level as the envelope of the Sv minima. The AWF denoising algorithm outperforms existing algorithms in the presence of Gaussian, speckle, and salt-and-pepper noise, although impulse noise needs to be removed beforehand. Cleaned echograms present homogeneous echotraces with outlined edges.
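
A minimal sketch of the local-statistics flavour of adaptive Wiener filtering, as a 1-D stand-in for the 2-D echogram version described above; the window size and noise-variance estimate are assumptions, not the paper's choices.

```python
import numpy as np

def adaptive_wiener(x, window=5, noise_var=None):
    """1-D adaptive Wiener filter: each sample is shrunk toward its local
    mean by (local_var - noise_var) / local_var, so smoothing is strong in
    flat (noise-dominated) regions and weak near edges such as echotraces."""
    pad = window // 2
    xp = np.pad(x, pad, mode="edge")
    # Local mean and variance from a sliding window around each sample
    win = np.lib.stride_tricks.sliding_window_view(xp, window)
    mean = win.mean(axis=1)
    var = win.var(axis=1)
    if noise_var is None:
        noise_var = var.mean()   # crude noise estimate: average local variance
    gain = np.maximum(var - noise_var, 0.0) / np.maximum(var, 1e-12)
    return mean + gain * (x - mean)
```

The same shrinkage applies in 2-D with a small rectangular window over the echogram samples.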

  15. Development and application of a Chinese webpage suicide information mining system (sims).

    PubMed

    Chen, Penglai; Chai, Jing; Zhang, Lu; Wang, Debin

    2014-11-01

    This study aims at designing and piloting a convenient Chinese webpage suicide information mining system (SIMS) to help search and filter required data from the internet and discover potential features and trends of suicide. SIMS uses Microsoft Visual Studio 2008, SQL 2008, and C# as development tools. It collects webpage data via popular search engines; cleans the data using trained models plus minimal manual help; translates the cleaned texts into quantitative data through models and supervised fuzzy recognition; and analyzes and visualizes related variables with self-programmed algorithms. The SIMS developed comprises such functions as suicide news and blog collection; data filtering, cleaning, extraction, and translation; and data analysis and presentation. SIMS-mediated mining of one year of webpages revealed that: peak months and hours of web-reported suicide events were June-July and 10-11 am respectively, and the lowest, September-October and 1-7 am; suicide reports came mostly from Soho, Tecent, Sina, etc.; male suicide victims outnumbered female victims in most sub-regions except southwest China; homes, public places, and rented houses were the top three places to commit suicide; poisoning, vein cutting, and jumping from buildings were the most commonly used methods; and love disputes, family disputes, and mental diseases were the leading causes. SIMS provides a preliminary and supplementary means for monitoring and understanding suicide. It proposes useful aspects as well as tools for analyzing the features and trends of suicide using data derived from Chinese webpages. Yet given the intrinsic "dual nature" of internet-based suicide information and the tremendous difficulties experienced by ourselves and other researchers, there is still a long way to go to expand, refine, and evaluate the system.

  16. Financing CHP Projects at Wastewater Treatment Facilities with Clean Water State Revolving Funds

    EPA Pesticide Factsheets

    This factsheet provides information about CHP at wastewater treatment facilities, including applications, financial challenges, and financial opportunities, such as the Clean Water State Revolving Fund.

  17. Physicochemical cleaning and recovery of coal

    NASA Astrophysics Data System (ADS)

    Wheelock, T. D.

    1982-03-01

    The development and demonstration of a method of depressing iron pyrites which is applicable to both the froth flotation and oil agglomeration methods of cleaning and recovering fine-size coal are described.

  18. Development of a bar code-based exposure assessment method to evaluate occupational exposure to disinfectants and cleaning products: a pilot study.

    PubMed

    Quinot, Catherine; Amsellem-Dubourget, Sylvie; Temam, Sofia; Sevin, Etienne; Barreto, Christine; Tackin, Arzu; Félicité, Jérémy; Lyon-Caen, Sarah; Siroux, Valérie; Girard, Raphaële; Descatha, Alexis; Le Moual, Nicole; Dumas, Orianne

    2018-05-14

    Healthcare workers are highly exposed to various types of disinfectants and cleaning products. Assessment of exposure to these products remains a challenge. We aimed to investigate the feasibility of a method, based on a smartphone application and bar codes, to improve occupational exposure assessment among hospital/cleaning workers in epidemiological studies. A database of disinfectants and cleaning products used in French hospitals, including their names, bar codes, and composition, was developed using several sources: ProdHyBase (a database of disinfectants managed by hospital hygiene experts), and specific regulatory agency and industrial websites. A smartphone application was created to scan the bar codes of products and fill in a short questionnaire. The ease of use and the ability to record information through this new approach were estimated. The method was tested in a French hospital (7 units, 14 participants). Through the application, 126 records (one record referring to one product entered by one participant/unit) were registered, the majority of which were liquids (55.5%) or sprays (23.8%); 20.6% were used to clean surfaces and 15.9% to clean toilets. Workers mostly used products containing alcohol and quaternary ammonium compounds (>90% with weekly use), followed by hypochlorite bleach and hydrogen peroxide (28.6%). For most records, information was available on the name (93.7%) and bar code (77.0%). Information on product compounds was available for all products and recorded in the database. This innovative and easy-to-use method could help to improve the assessment of occupational exposure to disinfectants/cleaning products in epidemiological studies.

  19. Solid Lubrication Fundamentals and Applications. Properties of Clean Surfaces: Adhesion, Friction, and Wear

    NASA Technical Reports Server (NTRS)

    Miyoshi, Kazuhisa

    1998-01-01

    This chapter presents the adhesion, friction, and wear behaviors of smooth, atomically clean surfaces of solid-solid couples, such as metal-ceramic couples, in a clean environment. Surface and bulk properties, which determine the adhesion, friction, and wear behaviors of solid-solid couples, are described. The primary emphasis is on the nature and character of the metal, especially its surface energy and ductility. Also, the mechanisms of friction and wear for clean, smooth surfaces are stated.

  20. Colorado SIP: 5 CCR 1001-13, Reg 11, Motor Vehicle Emissions Inspection Program—Part A, General Provisions, Area of Applicability, Schedules for Obtaining Certification of Emissions Control, Definitions, Exemptions, and Clean Screening/Remote Sensing

    EPA Pesticide Factsheets


  1. Predicting Activity Energy Expenditure Using the Actical[R] Activity Monitor

    ERIC Educational Resources Information Center

    Heil, Daniel P.

    2006-01-01

    This study developed algorithms for predicting activity energy expenditure (AEE) in children (n = 24) and adults (n = 24) from the Actical[R] activity monitor. Each participant performed 10 activities (supine resting, three sitting, three house cleaning, and three locomotion) while wearing monitors on the ankle, hip, and wrist; AEE was computed…

  2. Empirical scoring functions for advanced protein-ligand docking with PLANTS.

    PubMed

    Korb, Oliver; Stützle, Thomas; Exner, Thomas E

    2009-01-01

    In this paper we present two empirical scoring functions, PLANTS(CHEMPLP) and PLANTS(PLP), designed for our docking algorithm PLANTS (Protein-Ligand ANT System), which is based on ant colony optimization (ACO). They are related, regarding their functional form, to parts of already published scoring functions and force fields. The parametrization procedure described here was able to identify several parameter settings showing an excellent performance for the task of pose prediction on two test sets comprising 298 complexes in total. Up to 87% of the complexes of the Astex diverse set and 77% of the CCDC/Astex clean listnc (noncovalently bound complexes of the clean list) could be reproduced with root-mean-square deviations of less than 2 Å with respect to the experimentally determined structures. A comparison with the state-of-the-art docking tool GOLD clearly shows that this is, especially for the druglike Astex diverse set, an improvement in pose prediction performance. Additionally, optimized parameter settings for the search algorithm were identified, which can be used to balance pose prediction reliability and search speed.
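
Pose-prediction success here is defined by an RMSD cutoff of 2 Å. A minimal sketch of that check follows; the coordinates are hypothetical, and superposition and symmetry handling are ignored.

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two matched sets of 3-D atom
    coordinates (same atom order, already superposed; symmetry ignored)."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Hypothetical coordinates: a pose counts as reproduced if RMSD < 2 Angstroms.
predicted = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]
crystal = [(0.1, 0.0, 0.0), (1.0, 1.2, 1.0)]
success = rmsd(predicted, crystal) < 2.0
```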

  3. Materials compatibility and aging for flux and cleaner combinations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archuleta, Kim M.; Piatt, Rochelle

    2015-01-01

    A materials study of high reliability electronics cleaning is presented here. In Phase 1, mixed-type substrates underwent a condensed contaminant application to view a worst-case scenario for unremoved flux with cleaning agent residue on parts in a silicone-oil-filled environment. In Phase 2, fluxes applied to copper coupons and to printed wiring boards underwent gentle cleaning and then accelerated aging in air at 65% humidity and 30 °C. Both sets were aged for 4 weeks. Contaminants were no-clean (ORL0), water soluble (ORH1 liquid and ORH0 paste), and rosin (RMA; ROL0) fluxes. Defluxing agents were water, solvents, and engineered aqueous defluxers. In the first phase, coupons had flux applied and heated, then were placed in vials of oil with a small amount of cleaning agent and additional coupons. In the second phase, pairs of copper coupons and PWBs were hand soldered by application of each flux, using tin-lead solder in a strip across the coupon or a set of test components on the PWB. One of each pair was cleaned in each cleaning agent, the first with a typical clean and the second with a brief clean. Ionic contamination residue was measured before accelerated aging. After aging, substrates were removed and a visual record of coupon damage was made, from which a subjective rank was assigned for comparison between the various flux and defluxer combinations; more corrosion equated to a higher rank. The ORH1 water soluble flux resulted in the highest ranking in both phases, the RMA flux the least. For the first phase, in which flux and defluxer remained on coupons, the aqueous defluxers led to worse corrosion. The vapor phase cleaning agents resulted in the highest ranking in the second phase, in which there was no physical cleaning. Further study of cleaning and rinsing parameters will be required.

  4. 40 CFR Appendix B to Subpart G of... - Substitutes Subject to Use Restrictions and Unacceptable Substitutes

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Use Conditions Substitutes Application Substitute Decision Conditions Comments Electronics Cleaning w.... Electronics cleaning w/CFC-113 Dibromomethane Unacceptable High ODP; other alternatives exist. Electronics...

  5. Determining If a Cleaning Product Is a Pesticide Under FIFRA

    EPA Pesticide Factsheets

    Information that describes the Agency’s interpretation of the statutory and regulatory language applicable to products marketed as cleaning products that claim, state or imply that they mitigate a pest.

  6. Longitudinal multiple imputation approaches for body mass index or other variables with very low individual-level variability: the mibmi command in Stata.

    PubMed

    Kontopantelis, Evangelos; Parisi, Rosa; Springate, David A; Reeves, David

    2017-01-13

    In modern health care systems, the computerization of all aspects of clinical care has led to the development of large data repositories. For example, in the UK, large primary care databases hold millions of electronic medical records, with detailed information on diagnoses, treatments, outcomes, and consultations. Careful analyses of these observational datasets of routinely collected data can complement evidence from clinical trials or even answer research questions that cannot be addressed in an experimental setting. However, 'missingness' is a common problem for routinely collected data, especially for biological parameters over time. Absence of complete data for the whole of an individual's study period is a potential bias risk, and standard complete-case approaches may lead to biased estimates. However, the structure of the data values makes standard cross-sectional multiple-imputation approaches unsuitable. In this paper we propose and evaluate mibmi, a new command for cleaning and imputing longitudinal body mass index data. The regression-based data cleaning aspects of the algorithm can be useful when researchers analyze messy longitudinal data. Although the multiple imputation algorithm is computationally expensive, it performed similarly to, or even better than, existing alternatives when interpolating observations. The mibmi algorithm can be a useful tool for analyzing longitudinal body mass index data, or other longitudinal data with very low individual-level variability.
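
The regression-based cleaning step can be illustrated with a simple per-individual residual screen. This is a crude stand-in for mibmi's actual procedure, assuming an approximately linear within-person trajectory; the threshold is an assumption.

```python
import statistics

def clean_longitudinal(times, values, max_sd=3.0):
    """Flag implausible points in one individual's series by fitting a
    linear trend and marking residuals beyond max_sd standard deviations.
    A crude stand-in for mibmi's regression-based cleaning."""
    n = len(times)
    if n < 3:
        return [False] * n
    # Ordinary least-squares slope and intercept
    mt, mv = statistics.fmean(times), statistics.fmean(values)
    sxx = sum((t - mt) ** 2 for t in times)
    slope = sum((t - mt) * (v - mv) for t, v in zip(times, values)) / sxx
    intercept = mv - slope * mt
    resid = [v - (intercept + slope * t) for t, v in zip(times, values)]
    sd = statistics.stdev(resid)
    if sd == 0:
        return [False] * n
    return [abs(r) > max_sd * sd for r in resid]
```

A single gross outlier inflates the residual standard deviation, so with short series a robust (e.g. median-based) scale or a lower threshold may be needed to catch it.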

  7. Image Reconstruction in Radio Astronomy with Non-Coplanar Synthesis Arrays

    NASA Astrophysics Data System (ADS)

    Goodrick, L.

    2015-03-01

    Traditional radio astronomy imaging techniques assume that the interferometric array is coplanar with a small field of view, and that the two-dimensional Fourier relationship between brightness and visibility remains valid, allowing the Fast Fourier Transform to be used. In practice, to acquire more accurate data, the non-coplanar baseline effects need to be incorporated, as small height variations in the array plane introduce the w spatial frequency component. This component adds an additional phase shift to the incoming signals. There are two approaches to account for the non-coplanar baseline effects: either the full three-dimensional brightness and visibility model can be used to reconstruct an image, or the non-coplanar effects can be removed, reducing the three-dimensional relationship to the two-dimensional one. This thesis describes and implements the w-projection and w-stacking algorithms. The aim of these algorithms is to account for the phase error introduced by non-coplanar synthesis array configurations, making the recovered visibilities more true to the actual brightness distribution model. This is done by reducing the 3D visibilities to a 2D visibility model. The algorithms also have the added benefit of wide-field imaging, although w-stacking supports a wider field of view at the cost of more FFT bin support. For w-projection, the w-term is accounted for in the visibility domain by convolving it out of the problem with a convolution kernel, allowing the use of the two-dimensional Fast Fourier Transform. Similarly, the w-stacking algorithm applies a phase correction to image layers in the image domain to produce an intensity model that accounts for the non-coplanar baseline effects. This project considers the KAT7 array for simulation and analysis of the limitations and advantages of both algorithms. Additionally, a variant of the Högbom CLEAN algorithm was used which employs contour trimming for extended source emission flagging.
The CLEAN algorithm is an iterative two-dimensional deconvolution method that can further improve image fidelity by removing the effects of the point spread function which can obscure source data.
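
The non-coplanar phase term that both algorithms correct for can be written down directly. A minimal sketch follows; sign conventions vary between packages, and this uses the common exp(-2*pi*i*w*(sqrt(1 - l^2 - m^2) - 1)) form.

```python
import numpy as np

def w_phase_screen(l, m, w):
    """Image-plane phase screen exp(-2*pi*i * w * (sqrt(1 - l^2 - m^2) - 1))
    introduced by the w-term for a source at direction cosines (l, m);
    w-projection and w-stacking both correct for this factor."""
    n_minus_1 = np.sqrt(1.0 - l ** 2 - m ** 2) - 1.0
    return np.exp(-2j * np.pi * w * n_minus_1)
```

w-stacking multiplies each image layer by the conjugate of this screen for its w value before summing; w-projection instead convolves the visibilities with the screen's Fourier transform. At the phase centre (l = m = 0) or for w = 0 the screen is unity, recovering the coplanar case.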

  8. Robust sparse image reconstruction of radio interferometric observations with PURIFY

    NASA Astrophysics Data System (ADS)

    Pratley, Luke; McEwen, Jason D.; d'Avezac, Mayeul; Carrillo, Rafael E.; Onose, Alexandru; Wiaux, Yves

    2018-01-01

    Next-generation radio interferometers, such as the Square Kilometre Array, will revolutionize our understanding of the Universe through their unprecedented sensitivity and resolution. However, to realize these goals, significant challenges in image and data processing need to be overcome. The standard methods in radio interferometry for reconstructing images, such as CLEAN, have served the community well over the last few decades and have survived largely because they are pragmatic. However, they produce reconstructed interferometric images that are limited in quality and scalability for big data. In this work, we apply and evaluate alternative interferometric reconstruction methods that make use of state-of-the-art sparse image reconstruction algorithms motivated by compressive sensing, which have been implemented in the PURIFY software package. In particular, we implement and apply the proximal alternating direction method of multipliers algorithm presented in a recent article. First, we assess the impact of the interpolation kernel used to perform gridding and degridding on sparse image reconstruction. We find that the Kaiser-Bessel interpolation kernel performs as well as prolate spheroidal wave functions while providing a computational saving and an analytic form. Second, we apply PURIFY to real interferometric observations from the Very Large Array and the Australia Telescope Compact Array and find that images recovered by PURIFY are of higher quality than those recovered by CLEAN. Third, we discuss how PURIFY reconstructions exhibit additional advantages over those recovered by CLEAN. The latest version of PURIFY, with the developments presented in this work, is made publicly available.
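
The sparsity-promoting core of such compressive-sensing solvers is the l1 proximal operator (soft thresholding). The sketch below is generic, not PURIFY's actual implementation; the tiny ISTA loop and its parameters are illustrative stand-ins for the ADMM solver.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of lam * ||x||_1 (soft thresholding): the
    sparsity-promoting step inside ADMM/forward-backward solvers."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista(A, y, lam=0.1, iters=200):
    """Minimise 0.5 * ||A x - y||^2 + lam * ||x||_1 by iterative
    shrinkage-thresholding (a simpler cousin of the ADMM approach)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft_threshold(x - step * A.T @ (A @ x - y), step * lam)
    return x
```

In interferometric imaging, A would combine the (de)gridded Fourier measurement operator with a sparsifying transform, and x the image coefficients.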

  9. An effective temperature compensation approach for ultrasonic hydrogen sensors

    NASA Astrophysics Data System (ADS)

    Tan, Xiaolong; Li, Min; Arsad, Norhana; Wen, Xiaoyan; Lu, Haifei

    2018-03-01

    Hydrogen is a promising clean energy resource with wide application prospects; leakage of hydrogen gas, however, raises a serious safety issue, so measurement of its concentration is of great significance. In a traditional approach to ultrasonic hydrogen sensing, a temperature drift of 0.1 °C results in a concentration error of about 250 ppm, which is intolerable for trace gas sensing. To eliminate the influence of temperature drift, we propose a feasible approach, named the linear compensation algorithm, which uses the linear relationship between pulse count and temperature to compensate for the pulse count error (ΔN) caused by temperature drift. Experimental results demonstrate that the proposed approach improves measurement accuracy and can easily detect sub-100 ppm hydrogen concentrations under variable temperature conditions.
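
The linear compensation idea can be sketched directly: fit the count-vs-temperature slope from calibration data, then subtract the predicted drift at measurement time. All numbers below are hypothetical, not the paper's calibration values.

```python
def fit_compensation(temps, counts, t_ref=25.0):
    """Least-squares slope of pulse count vs. temperature from calibration
    runs at a fixed reference gas mixture; returns a function that removes
    the temperature-induced pulse count error (delta N)."""
    n = len(temps)
    mt, mc = sum(temps) / n, sum(counts) / n
    slope = (sum((t - mt) * (c - mc) for t, c in zip(temps, counts))
             / sum((t - mt) ** 2 for t in temps))
    return lambda count, temp: count - slope * (temp - t_ref)

# Hypothetical calibration: counts drift by 2 counts per degree C.
temps = [20.0, 22.0, 24.0, 26.0, 28.0]
counts = [1000.0, 1004.0, 1008.0, 1012.0, 1016.0]
compensate = fit_compensation(temps, counts)
```

After compensation, measurements taken at different temperatures map to the same reference-temperature count, so only concentration changes remain.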

  10. Model evaluation of the phytoextraction potential of heavy metal hyperaccumulators and non-hyperaccumulators.

    PubMed

    Liang, Hong-Ming; Lin, Ting-Hsiang; Chiou, Jeng-Min; Yeh, Kuo-Chen

    2009-06-01

    Evaluation of the remediation ability of zinc/cadmium in hyper- and non-hyperaccumulator plant species through greenhouse studies is limited. To bridge the gap between greenhouse studies and field applications for phytoextraction, we used published data to examine the partitioning of heavy metals between plants and soil (defined as the bioconcentration factor). We compared the remediation ability of the Zn/Cd hyperaccumulators Thlaspi caerulescens and Arabidopsis halleri and the non-hyperaccumulators Nicotiana tabacum and Brassica juncea using a hierarchical linear model (HLM). A recursive algorithm was then used to evaluate how many harvest cycles were required to clean a contaminated site to meet Taiwan Environmental Protection Agency regulations. Despite the high bioconcentration factors of both hyperaccumulators, metal removal was still limited because of the plants' small biomass. Simulation with N. tabacum and the cadmium model suggests further study and development of plants with high biomass and improved phytoextraction potential for use in environmental cleanup.
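
The recursive harvest-cycle evaluation can be sketched as a geometric-decay loop. This assumes each cycle removes a fixed fraction of the remaining soil burden, a simplification of the HLM-based model; all numbers are hypothetical.

```python
def harvest_cycles(c0, target, removal_fraction):
    """Harvest cycles needed to bring the soil concentration c0 down to a
    regulatory target, assuming each cycle removes a fixed fraction of the
    remaining burden (roughly BCF x plant biomass / soil mass per area)."""
    if c0 <= target:
        return 0
    concentration, cycles = c0, 0
    while concentration > target:
        concentration *= (1.0 - removal_fraction)
        cycles += 1
    return cycles
```

With a small removal fraction, the cycle count grows roughly like log(target/c0) / log(1 - removal_fraction), which is why low-biomass hyperaccumulators can still need impractically many harvests.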

  11. Audio Tracking in Noisy Environments by Acoustic Map and Spectral Signature.

    PubMed

    Crocco, Marco; Martelli, Samuele; Trucco, Andrea; Zunino, Andrea; Murino, Vittorio

    2018-05-01

    A novel method is proposed for generic target tracking by audio measurements from a microphone array. To cope with noisy environments characterized by persistent and high energy interfering sources, a classification map (CM) based on spectral signatures is calculated by means of a machine learning algorithm. Next, the CM is combined with the acoustic map, describing the spatial distribution of sound energy, in order to obtain a cleaned joint map in which contributions from the disturbing sources are removed. A likelihood function is derived from this map and fed to a particle filter yielding the target location estimation on the acoustic image. The method is tested on two real environments, addressing both speaker and vehicle tracking. The comparison with a couple of trackers, relying on the acoustic map only, shows a sharp improvement in performance, paving the way to the application of audio tracking in real challenging environments.

  12. Application of surface-enhanced Raman spectroscopy (SERS) for cleaning verification in pharmaceutical manufacture.

    PubMed

    Corrigan, Damion K; Cauchi, Michael; Piletsky, Sergey; Mccrossen, Sean

    2009-01-01

    Cleaning verification is the process by which pharmaceutical manufacturing equipment is determined as sufficiently clean to allow manufacture to continue. Surface-enhanced Raman spectroscopy (SERS) is a very sensitive spectroscopic technique capable of detection at levels appropriate for cleaning verification. In this paper, commercially available Klarite SERS substrates were employed in order to obtain the necessary enhancement of signal for the identification of chemical species at concentrations of 1 to 10 ng/cm², which are relevant to cleaning verification. The SERS approach was combined with principal component analysis in the identification of drug compounds recovered from a contaminated steel surface.

  13. Automatic motion and noise artifact detection in Holter ECG data using empirical mode decomposition and statistical approaches.

    PubMed

    Lee, Jinseok; McManus, David D; Merchant, Sneh; Chon, Ki H

    2012-06-01

    We present a real-time method for the detection of motion and noise (MN) artifacts, which frequently interfere with accurate rhythm assessment when ECG signals are collected from Holter monitors. Our MN artifact detection approach involves two stages. The first stage uses the first-order intrinsic mode function (F-IMF) from empirical mode decomposition to isolate the artifacts' dynamics, as they are largely concentrated in the higher frequencies. The second stage applies three statistical measures to the F-IMF time series to look for the randomness and variability that are hallmark signatures of MN artifacts: the Shannon entropy, mean, and variance. We then use the receiver operating characteristic curve on Holter data from 15 healthy subjects to derive threshold values for these statistical measures that separate clean data segments from MN artifacts. With threshold values derived from 15 training data sets, we tested our algorithms on 30 additional healthy subjects. Our results show that our algorithms are able to detect the presence of MN artifacts with sensitivity and specificity of 96.63% and 94.73%, respectively. In addition, when we applied our previously developed algorithm for atrial fibrillation (AF) detection to those segments labeled as free from MN artifacts, the specificity increased from 73.66% to 85.04% without loss of sensitivity (74.48%-74.62%) on six subjects diagnosed with AF. Finally, the computation time was less than 0.2 s using MATLAB code, indicating that real-time application of the algorithms is possible for Holter monitoring.
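
The three F-IMF statistics can be computed with the standard library. Below is a rough sketch of the thresholding stage; it skips the EMD step that would produce the F-IMF, and the thresholds are illustrative assumptions rather than the paper's ROC-derived values.

```python
import math
from collections import Counter

def shannon_entropy(samples, bins=16):
    """Shannon entropy of a histogram of the series, one of the three
    statistical measures thresholded on the F-IMF."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0
    hist = Counter(min(int((s - lo) / width), bins - 1) for s in samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in hist.values())

def looks_like_artifact(samples, entropy_thr=3.0, var_thr=0.5):
    """Flag a segment as a motion/noise artifact when it shows both the
    high entropy and the high variance characteristic of MN artifacts."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return shannon_entropy(samples) > entropy_thr and var > var_thr
```

In the full method these features are computed per F-IMF segment, and the thresholds come from the training-set ROC analysis.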

  14. Comparison of genetic algorithm and imperialist competitive algorithms in predicting bed load transport in clean pipe.

    PubMed

    Ebtehaj, Isa; Bonakdari, Hossein

    2014-01-01

    The existence of sediment in wastewater greatly affects the performance of sewer and wastewater transmission systems. Increased sedimentation in wastewater collection systems causes problems such as reduced transmission capacity and early combined sewer overflow. This article reviews the performance of the genetic algorithm (GA) and the imperialist competitive algorithm (ICA) in minimizing the target function (the mean square error of observed and predicted Froude numbers). To study the impact of bed load transport parameters, six different models using four non-dimensional groups have been presented. Moreover, the roulette wheel selection method is used to select the parents. For the selected model, the ICA (root mean square error (RMSE) = 0.007, mean absolute percentage error (MAPE) = 3.5%) shows better results than the GA (RMSE = 0.007, MAPE = 5.6%). The ICA returns better results than the GA for all six models. Also, the results of these two algorithms were compared with a multi-layer perceptron and existing equations.
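
The roulette wheel selection mentioned above is standard fitness-proportionate sampling. A generic sketch follows; the population and fitness values are hypothetical.

```python
import random

def roulette_select(population, fitnesses, rng=random):
    """Roulette wheel (fitness-proportionate) selection: an individual's
    chance of becoming a parent equals its share of the total fitness."""
    total = sum(fitnesses)
    pick = rng.uniform(0.0, total)
    cumulative = 0.0
    for individual, fitness in zip(population, fitnesses):
        cumulative += fitness
        if pick <= cumulative:
            return individual
    return population[-1]   # guard against floating-point round-off
```

For a minimization problem such as the Froude-number error here, raw errors would first be mapped to fitness values (e.g. the reciprocal of the error) before the wheel is spun.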

  15. 40 CFR 122.36 - As an operator of a regulated small MS4, what happens if I don't comply with the application or...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) WATER PROGRAMS EPA ADMINISTERED PERMIT PROGRAMS: THE NATIONAL POLLUTANT DISCHARGE ELIMINATION SYSTEM... and penalties described in Clean Water Act sections 309 (b), (c), and (g) and 505, or under applicable State, Tribal, or local law. Compliance with a permit issued pursuant to section 402 of the Clean Water...

  16. 40 CFR 122.36 - As an operator of a regulated small MS4, what happens if I don't comply with the application or...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) WATER PROGRAMS EPA ADMINISTERED PERMIT PROGRAMS: THE NATIONAL POLLUTANT DISCHARGE ELIMINATION SYSTEM... and penalties described in Clean Water Act sections 309 (b), (c), and (g) and 505, or under applicable State, Tribal, or local law. Compliance with a permit issued pursuant to section 402 of the Clean Water...

  17. Development and test results of a flight management algorithm for fuel conservative descents in a time-based metered traffic environment

    NASA Technical Reports Server (NTRS)

    Knox, C. E.; Cannon, D. G.

    1980-01-01

    A simple flight management descent algorithm, designed to improve the accuracy of delivering an airplane in a fuel-conservative manner to a metering fix at a time designated by air traffic control, was developed and flight tested. This algorithm provides a three-dimensional path with terminal-area time constraints (four-dimensional) for an airplane to make an idle-thrust, clean-configuration (landing gear up, flaps zero, and speed brakes retracted) descent and arrive at the metering fix at a predetermined time, altitude, and airspeed. The descent path was calculated for a constant Mach/airspeed schedule from linear approximations of airplane performance, with consideration given to gross weight, wind, and nonstandard pressure and temperature effects. The flight management descent algorithm is described, and the results of flight tests flown with the Terminal Configured Vehicle airplane are presented.
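    As a toy illustration of this kind of linearized descent planning (not the flight-tested algorithm, which also modeled gross weight, wind, and nonstandard atmosphere), one can estimate where an idle descent must begin from a constant ground speed and descent rate; all numbers here are hypothetical:

```python
def descent_start_distance(cruise_alt_ft, fix_alt_ft, ground_speed_kt, descent_rate_fpm):
    """Distance (nautical miles) before the metering fix at which an idle
    descent must begin, assuming a constant ground speed and descent rate.
    A crude linearization for illustration only."""
    time_min = (cruise_alt_ft - fix_alt_ft) / descent_rate_fpm  # minutes of descent
    return ground_speed_kt * time_min / 60.0                    # kt * h = nmi

# Hypothetical numbers: FL350 to a 10,000 ft fix at 420 kt, 2,500 ft/min
print(descent_start_distance(35000, 10000, 420, 2500))  # → 70.0
```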

  18. Clean Sampling of an Englacial Conduit at Blood Falls, Antarctica - Some Experimental and Numerical Results

    NASA Astrophysics Data System (ADS)

    Kowalski, Julia; Francke, Gero; Feldmann, Marco; Espe, Clemens; Heinen, Dirk; Digel, Ilya; Clemens, Joachim; Schüller, Kai; Mikucki, Jill; Tulaczyk, Slawek M.; Pettit, Erin; Berry Lyons, W.; Dachwald, Bernd

    2017-04-01

    There is significant interest in sampling subglacial environments for geochemical and microbiological studies, yet these environments are typically difficult to access. Existing ice-drilling technologies make it cumbersome to maintain microbiologically clean access for sample acquisition and environmental stewardship of potentially fragile subglacial aquatic ecosystems. With the "IceMole", a minimally invasive, maneuverable subsurface ice probe, we have developed a clean glacial exploration technology for in-situ analysis and sampling of glacial ice and sub- and englacial materials. Its design combines melting and mechanical stabilization, using an ice screw at the tip of the melting head to maintain firm contact between the melting head and the ice. The IceMole can change its melting direction by differential heating of the melting head and optional side-wall heaters. Downward, horizontal, and upward melting, as well as curved trajectories and the penetration of particulate-laden layers, have already been demonstrated in several field tests. This maneuverability also necessitates a sophisticated on-board navigation system capable of autonomous operation. Therefore, between 2012 and 2014, a more advanced probe was developed as part of the "Enceladus Explorer" (EnEx) project. The EnEx-IceMole offers systems for accurate positioning based on in-ice attitude determination, acoustic positioning, and ultrasonic obstacle and target detection, all integrated through a high-level sensor fusion algorithm. In December 2014, the EnEx-IceMole was used for clean access into a unique subglacial aquatic environment at Blood Falls, Antarctica, where an englacial brine sample was successfully obtained after about 17 meters of oblique melting. Particular attention was paid to clean sampling protocols for geochemical and microbiological analysis. 
In this contribution, we describe the general technological approach of the IceMole and report on the results of its deployment at Blood Falls. In contrast to conventional melting probes, which can only melt vertically, the IceMole followed an oblique melting path to penetrate the englacial conduit. Experimental and numerical results on melting at oblique angles are rare. Besides reporting on the IceMole technology and the field deployment itself, we compare and discuss the observed melting behavior with re-analysis results in the context of a recently developed numerical model. Finally, we present our first steps in using the model to draw inferences about the ambient cryo-environment.

  19. One-step fabrication of robust superhydrophobic and superoleophilic surfaces with self-cleaning and oil/water separation function.

    PubMed

    Zhang, Zhi-Hui; Wang, Hu-Jun; Liang, Yun-Hong; Li, Xiu-Juan; Ren, Lu-Quan; Cui, Zhen-Quan; Luo, Cheng

    2018-03-01

    Superhydrophobic surfaces have great potential for application in self-cleaning and oil/water separation. However, the large-scale practical applications of superhydrophobic coating surfaces are impeded by many factors, such as complicated fabrication processes, the use of fluorinated reagents and noxious organic solvents and poor mechanical stability. Herein, we describe the successful preparation of a fluorine-free multifunctional coating without noxious organic solvents that was brushed, dipped or sprayed onto glass slides and stainless-steel meshes as substrates. The obtained multifunctional superhydrophobic and superoleophilic surfaces (MSHOs) demonstrated self-cleaning abilities even when contaminated with or immersed in oil. The superhydrophobic surfaces were robust and maintained their water repellency after being scratched with a knife or abraded with sandpaper for 50 cycles. In addition, stainless-steel meshes sprayed with the coating quickly separated various oil/water mixtures with a high separation efficiency (>93%). Furthermore, the coated mesh maintained a high separation efficiency above 95% over 20 cycles of separation. This simple and effective strategy will inspire the large-scale fabrication of multifunctional surfaces for practical applications in self-cleaning and oil/water separation.

  20. Nonflammable, Nonaqueous, Low Atmospheric Impact, High Performance Cleaning Solvents

    NASA Technical Reports Server (NTRS)

    Dhooge, P. M.; Glass, S. M.; Nimitz, J. S.

    2001-01-01

    For many years, chlorofluorocarbon (CFC) and chlorocarbon solvents have played an important part in aerospace operations. These solvents found extensive use as cleaning and analysis solvents in precision and critical cleaning. However, CFCs and chlorocarbon solvents have deleterious effects on the ozone layer, are relatively strong greenhouse gases, and some are suspected or known carcinogens. Because of their ozone-depletion potential (ODP), the Montreal Protocol and its amendments, as well as other environmental regulations, have resulted in the phaseout of CFC-113 and 1,1,1-trichloroethane (TCA). Although alternatives have been recommended, they do not perform as well as the original solvents. In addition, some analyses, such as the infrared analysis of extracted hydrocarbons, cannot be performed with substitute solvents that contain C-H bonds. CFC-113, also known as Freon® TF, has been used for many critical aerospace applications, including extensive use in NASA's cleaning facilities for precision and critical cleaning, in particular final rinsing in Class 100 areas with gas chromatography analysis of the rinse residue. While some cleaning can be accomplished by other processes, there are certain critical applications where CFC-113 or a similar solvent is highly cost-effective and ensures safety. Oxygen system components are one example, where a solvent compatible with oxygen and capable of removing fluorocarbon grease is needed. Electronic components and precision mechanical components can also be damaged by aggressive cleaning solvents.

  1. 40 CFR 471.42 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Cyanide 0.179 0.074 Silver 0.253 0.105 (p) Alkaline cleaning spent baths. Subpart D—BAT Pollutant or...-pounds) of precious metals alkaline cleaned Cadmium 0.021 0.009 Copper 0.114 0.060 Cyanide 0.018 0.007 Silver 0.025 0.010 (q) Alkaline cleaning rinse. Subpart D—BAT Pollutant or pollutant property Maximum for...

  2. 40 CFR 471.41 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... at all times. (p) Alkaline cleaning spent baths. Subpart D—BPT Pollutant or pollutant property... metals alkaline cleaned Cadmium 0.021 0.009 Copper 0.114 0.060 Cyanide 0.018 0.007 Silver 0.025 0.010 Oil...) Alkaline cleaning rinse. Subpart D—BPT Pollutant or pollutant property Maximum for any 1 day Maximum for...

  3. 40 CFR 471.52 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 320 Molybdenum 60.9 27.0 (n) Alkaline cleaning spent baths. Subpart E—BAT Pollutant or pollutant... refractory metals alkaline cleaned Copper 0.428 0.204 Nickel 0.184 0.124 Fluoride 19.9 8.82 Molybdenum 1.68 0.745 (o) Alkaline cleaning rinse. Subpart E—BAT Pollutant or pollutant property Maximum for any 1 day...

  4. 40 CFR 471.41 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... at all times. (p) Alkaline cleaning spent baths. Subpart D—BPT Pollutant or pollutant property... metals alkaline cleaned Cadmium 0.021 0.009 Copper 0.114 0.060 Cyanide 0.018 0.007 Silver 0.025 0.010 Oil...) Alkaline cleaning rinse. Subpart D—BPT Pollutant or pollutant property Maximum for any 1 day Maximum for...

  5. 40 CFR 471.42 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Cyanide 0.179 0.074 Silver 0.253 0.105 (p) Alkaline cleaning spent baths. Subpart D—BAT Pollutant or...-pounds) of precious metals alkaline cleaned Cadmium 0.021 0.009 Copper 0.114 0.060 Cyanide 0.018 0.007 Silver 0.025 0.010 (q) Alkaline cleaning rinse. Subpart D—BAT Pollutant or pollutant property Maximum for...

  6. 40 CFR 471.82 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ....151 (j) Alkaline cleaning spent baths. Subpart H—BAT Pollutant or pollutant property Maximum for any 1 day Maximum for monthly average mg/off-kg (pounds per million off-pounds) of zinc alkaline cleaned Chromium 0.002 0.0006 Copper 0.005 0.002 Cyanide 0.0007 0.0003 Zinc 0.004 0.002 (k) Alkaline cleaning rinse...

  7. 40 CFR 471.82 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....072 0.029 Zinc 0.365 0.151 (j) Alkaline cleaning spent baths. Subpart H—BAT Pollutant or pollutant... zinc alkaline cleaned Chromium 0.002 0.0006 Copper 0.005 0.002 Cyanide 0.0007 0.0003 Zinc 0.004 0.002 (k) Alkaline cleaning rinse. Subpart H—BAT Pollutant or pollutant property Maximum for any 1 day...

  8. 40 CFR 471.42 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cyanide 0.179 0.074 Silver 0.253 0.105 (p) Alkaline cleaning spent baths. Subpart D—BAT Pollutant or...-pounds) of precious metals alkaline cleaned Cadmium 0.021 0.009 Copper 0.114 0.060 Cyanide 0.018 0.007 Silver 0.025 0.010 (q) Alkaline cleaning rinse. Subpart D—BAT Pollutant or pollutant property Maximum for...

  9. 40 CFR 471.41 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... at all times. (p) Alkaline cleaning spent baths. Subpart D—BPT Pollutant or pollutant property... metals alkaline cleaned Cadmium 0.021 0.009 Copper 0.114 0.060 Cyanide 0.018 0.007 Silver 0.025 0.010 Oil...) Alkaline cleaning rinse. Subpart D—BPT Pollutant or pollutant property Maximum for any 1 day Maximum for...

  10. 40 CFR 471.52 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 320 Molybdenum 60.9 27.0 (n) Alkaline cleaning spent baths. Subpart E—BAT Pollutant or pollutant... refractory metals alkaline cleaned Copper 0.428 0.204 Nickel 0.184 0.124 Fluoride 19.9 8.82 Molybdenum 1.68 0.745 (o) Alkaline cleaning rinse. Subpart E—BAT Pollutant or pollutant property Maximum for any 1 day...

  11. 40 CFR 471.51 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... range of 7.5 to 10.0 at all times. (n) Alkaline cleaning spent baths. Subpart E—BPT Pollutant or...-pounds) of refractory metals alkaline cleaned Copper 0.635 0.334 Nickel 0.641 0.424 Fluoride 19.9 8.82... all times. (o) Alkaline cleaning rinse. Subpart E—BPT Pollutant or pollutant property Maximum for any...

  12. 40 CFR 471.82 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ....151 (j) Alkaline cleaning spent baths. Subpart H—BAT Pollutant or pollutant property Maximum for any 1 day Maximum for monthly average mg/off-kg (pounds per million off-pounds) of zinc alkaline cleaned Chromium 0.002 0.0006 Copper 0.005 0.002 Cyanide 0.0007 0.0003 Zinc 0.004 0.002 (k) Alkaline cleaning rinse...

  13. 40 CFR 471.82 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ....072 0.029 Zinc 0.365 0.151 (j) Alkaline cleaning spent baths. Subpart H—BAT Pollutant or pollutant... zinc alkaline cleaned Chromium 0.002 0.0006 Copper 0.005 0.002 Cyanide 0.0007 0.0003 Zinc 0.004 0.002 (k) Alkaline cleaning rinse. Subpart H—BAT Pollutant or pollutant property Maximum for any 1 day...

  14. 40 CFR 471.41 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... at all times. (p) Alkaline cleaning spent baths. Subpart D—BPT Pollutant or pollutant property... metals alkaline cleaned Cadmium 0.021 0.009 Copper 0.114 0.060 Cyanide 0.018 0.007 Silver 0.025 0.010 Oil...) Alkaline cleaning rinse. Subpart D—BPT Pollutant or pollutant property Maximum for any 1 day Maximum for...

  15. The Chandra Source Catalog: Algorithms

    NASA Astrophysics Data System (ADS)

    McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created, including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified, and a set of post-processing scripts was used to correct the results. To analyse the source properties, we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled-energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial, and temporal properties of the sources and to estimate confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties were derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.

  16. Evaluation of Solvent Alternatives for Cleaning of Oxygen Systems

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Biesinger, Paul; Delgado, Rafael; Antin, Neil

    1999-01-01

    The NASA White Sands Test Facility (WSTF), in a joint program with the Naval Sea Systems Command, has evaluated a number of solvents as alternatives to the chlorofluorocarbons currently used for cleaning oxygen systems. Particular attention has been given to the cleaning of gauges and instrumentation used in oxygen service, since no aqueous alternatives have been identified for them. The requirements identified as selection criteria include toxicity, physical properties consistent with the application, flammability, oxygen compatibility, and cleaning ability. This paper provides a summary of results and recommendations for the solvents evaluated to date.

  17. FY 2017 Center Innovation Fund Annual Report - Highlights/Abstract section

    NASA Technical Reports Server (NTRS)

    Hintze, Paul; Youngquist, Robert C.; Massa, Gioia D.; Meier, Anne J.

    2017-01-01

    This project evaluated the feasibility of low-pressure cold plasma (CP) for two applications: disinfection of produce grown in space and sterilization of medical equipment in space. Currently there is no ISS capability for disinfecting pick-and-eat crops, food utensils, food production areas, or medical devices, a deficit that extends to projected long-duration missions. Small, portable cold plasma devices would benefit crew health and address concerns about microbial cross-contamination. The technology would also reduce solid waste, since crews currently use benzalkonium chloride wet wipes for cleaning surfaces and might use PRO-SAN wipes for cleaning vegetables. CP cleaning, disinfection, and sterilization works on many surfaces, including all metals and most polymers, and this project evaluated its use on produce. CP therefore provides a simple system with many different cleaning applications in space: produce, medical equipment, cutlery, and miscellaneous tools.

  18. ShellFit: Reconstruction in the MiniCLEAN Detector

    NASA Astrophysics Data System (ADS)

    Seibert, Stanley

    2010-02-01

    The MiniCLEAN dark matter experiment is an ultra-low background liquid cryogen detector with a fiducial volume of approximately 150 kg. Dark matter candidate events produce ultraviolet scintillation light in argon at 128 nm and in neon at 80 nm. In order to detect this scintillation light, the target volume is enclosed by acrylic plates forming a spherical shell upon which an organic fluor, tetraphenyl butadiene (TPB), has been applied. TPB absorbs UV light and re-emits visible light isotropically, which can be detected by photomultiplier tubes. Two significant sources of background events in MiniCLEAN are decays of radon daughters embedded in the acrylic surface and external sources of neutrons, such as the photomultiplier tubes themselves. Both of these backgrounds can be mitigated by reconstructing the origin of the scintillation light and cutting events beyond a particular radius. The scrambling of photon trajectories at the TPB surface makes this task very challenging. The ``ShellFit'' algorithm for reconstructing event position and energy in a detector with a spherical wavelength-shifting shell will be described. The performance of ShellFit will be demonstrated using Monte Carlo simulations of several event types in the MiniCLEAN detector.

  19. CO2 (dry ice) cleaning system

    NASA Technical Reports Server (NTRS)

    Barnett, Donald M.

    1995-01-01

    Tomco Equipment Company has participated in the dry ice (solid carbon dioxide, CO2) cleaning industry for over ten years as a pioneer in the manufacture of high-density dry ice cleaning pellet production equipment. For over four years, Tomco high-density pelletizers have been available to the dry ice cleaning industry. Approximately one year ago Tomco introduced the DI-250, a new dry ice blast unit, making Tomco a single-source supplier for sublimable-media particle-blast cleaning systems. This new blast unit is an all-pneumatic, single-discharge-hose device. It meters the insertion of 1/8 inch diameter (or smaller) high-density dry ice pellets into a high-pressure propellant gas stream. The dry ice and propellant streams are controlled and mixed from the blast cabinet. From there the mixture is transported to the nozzle, where the pellets are accelerated to an appropriate blasting velocity. When directed to impact upon a target area, these dry ice pellets have sufficient energy to effectively remove most surface coatings through dry, abrasive contact. The metastable dry ice pellets used for CO2 cleaning, while labeled 'high density,' are less dense than alternative abrasive particle blast media. In addition, after contacting the target surface, they return to their equilibrium condition: a superheated gas state. Most currently used grit blasting media are silicon dioxide based, with a sharp tetrahedral molecular structure. Silicon dioxide crystal structures will always produce smaller sharp-edged replicas of the original crystal upon fracture. Larger, softer dry ice pellets do not share the sharp-edged crystalline structures of their non-sublimable counterparts when broken. In fact, upon contact with the target surface, dry ice pellets plastically deform and break apart. As such, dry ice cleaning is less harmful to sensitive substrates, workers, and the environment than chemical or abrasive cleaning systems. 
Dry ice cleaning system components include: a dry ice pellet supply, a non-reactive propellant gas source, a pellet and propellant metering device, and a media transport and acceleration hose and nozzle arrangement. Dry ice cleaning system operating parameters include: choice of propellant gas, its pressure and temperature, dry ice mass flow rate, dry ice pellet size and shape, and acceleration nozzle configuration. These parameters may be modified to fit different applications. The growth of the dry ice cleaning industry will depend upon timely data acquisition of the effects that independent changes in these parameters have on cleaning rates, with respect to different surface coating and substrate combinations. With this data, optimization of cleaning rates for particular applications will be possible. The analysis of the applicable range of modulation of these parameters, within system component mechanical constraints, has just begun.

  20. Principles of Sterilization of Mars Descent Vehicle Elements

    NASA Astrophysics Data System (ADS)

    Trofimov, Vladislav; Deshevaya, Elena; Khamidullina, N.; Kalashnikov, Viktor

    Due to the severe COSPAR requirements on the permissible microbiological contamination of Mars descent spacecraft elements, as well as the complexity of their chemical composition and structure, the antimicrobial treatment (sterilization) of such elements during spacecraft integration requires a wide set of methods: chemical, ultraviolet, and radiation. The report analyzes all aspects of the applicable methods for cleaning element surfaces and interiors of microbiota. The analysis showed that the most important, predictable, and controllable method is radiation processing (for elements that do not change their properties after effective treatment). Experience with ionizing radiation for the sterilization of medical and other products shows that, depending on the initial microbial contamination of lander elements, the required absorbed dose lies in the range of 12-35 kGy. The effect of irregular radiation absorption in elements of complex structure on the choice of radiation methodology was analyzed, and an algorithm was suggested for choosing effective radiation treatment conditions and controlling sterilization efficiency. An important phase in establishing the effective treatment conditions for each structural element is experimental verification of the actual microbiological contamination during spacecraft integration, maximal reduction of that contamination using other cleaning procedures (mechanical, chemical, ultraviolet), and determination of the radiation resistance of spore-forming microorganisms typical of space technology manufacturing and assembly shops. 
Proceeding from three parameters (the irregularity of radiation absorption in a given element, its initial microbial contamination, and the resistance of the microorganisms to radiation), sterilization conditions for the packaged object are chosen that prevent secondary contamination and ensure the required treatment reliability without final experimental microbiological verification, using only simple control of the absorbed dose at critical points. All process phases (from the choice of treatment conditions to provision of procedural safety) are strictly regulated by Russian legislation in accordance with international standards.
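    The dependence of the required dose on initial contamination can be illustrated with the standard log-reduction (D-value) model used in radiation sterilization; this is a generic sketch, and the D10 value, bioburden, and sterility assurance level below are assumptions, not values from the report:

```python
import math

def required_dose_kgy(n0, sal, d10_kgy):
    """Absorbed dose needed to reduce an initial bioburden n0 to a target
    sterility assurance level `sal`, given the organism's D10 value (the
    dose for one decade of kill). Standard log-reduction model; all
    numbers used below are illustrative."""
    return d10_kgy * math.log10(n0 / sal)

# Hypothetical: 1e4 spores, SAL of 1e-6, D10 = 2 kGy -> 10 decades of kill
print(round(required_dose_kgy(1e4, 1e-6, 2.0), 1))  # → 20.0
```

    Under this model, higher initial contamination or more radiation-resistant spores push the required dose up, which is consistent with the 12-35 kGy range quoted above.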

  1. Challenges and Development of a Multi-Scale Computational Model for Photosystem I Decoupled Energy Conversion

    DTIC Science & Technology

    2013-06-01

    Applications of Molecular Modeling to Challenges in Clean Energy; Fitzgerald, G., et al.; ACS Symposium Series; American Chemical Society: Washington, DC, 2013. … develop models of spectral properties and energy transfer kinetics (20-22). Ivashin et al. optimized select ligands (α…

  2. Alternative Solvents through Green Chemistry Project

    NASA Technical Reports Server (NTRS)

    Hintze, Paul E.; Quinn, Jacqueline

    2014-01-01

    Components in the aerospace industry must perform with accuracy and precision under extreme conditions, and surface contamination can be detrimental to the desired performance, especially when the components come into contact with strong oxidizers such as liquid oxygen. Precision cleaning is therefore an important part of a component's preparation prior to use in aerospace applications. Current cleaning technologies employ a variety of cleaning agents, many of which are halogenated solvents that are either toxic or environmentally damaging. This project therefore seeks to identify alternative precision cleaning solvents and technologies, including less harmful cleaning solvents, ultrasonic and megasonic agitation, low-pressure plasma cleaning techniques, and supercritical carbon dioxide extraction. Please review all data content found in the Public Data tab located at: https://techport.nasa.gov/view/11697/public

  3. 40 CFR 263.31 - Discharge clean up.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... hazardous waste discharge no longer presents a hazard to human health or the environment. ....31 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS APPLICABLE TO TRANSPORTERS OF HAZARDOUS WASTE Hazardous Waste Discharges § 263.31 Discharge clean...

  4. 40 CFR 263.31 - Discharge clean up.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... hazardous waste discharge no longer presents a hazard to human health or the environment. ....31 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS APPLICABLE TO TRANSPORTERS OF HAZARDOUS WASTE Hazardous Waste Discharges § 263.31 Discharge clean...

  5. 40 CFR 263.31 - Discharge clean up.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... hazardous waste discharge no longer presents a hazard to human health or the environment. ....31 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS APPLICABLE TO TRANSPORTERS OF HAZARDOUS WASTE Hazardous Waste Discharges § 263.31 Discharge clean...

  6. 40 CFR 263.31 - Discharge clean up.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... hazardous waste discharge no longer presents a hazard to human health or the environment. ....31 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS APPLICABLE TO TRANSPORTERS OF HAZARDOUS WASTE Hazardous Waste Discharges § 263.31 Discharge clean...

  7. 40 CFR 263.31 - Discharge clean up.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... hazardous waste discharge no longer presents a hazard to human health or the environment. ....31 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) STANDARDS APPLICABLE TO TRANSPORTERS OF HAZARDOUS WASTE Hazardous Waste Discharges § 263.31 Discharge clean...

  8. PERFORMANCE TESTING OF AIR CLEANING PRODUCTS

    EPA Science Inventory

    The paper discuses the application of the Environmental Technology Verification (ETV) Program for products that clean ventilation air to the problem of protecting buildings from chemical and biological attack. This program is funded by the U.S. Environmental Protection Agency und...

  9. Magnetic pulse cleaning of products

    NASA Astrophysics Data System (ADS)

    Smolentsev, V. P.; Safonov, S. V.; Smolentsev, E. V.; Fedonin, O. N.

    2016-04-01

    The article deals with the application of magnetic pulses in new equipment and methods for cleaning cast precision blanks of fragile or granular thickened surface coatings, which are difficult to remove and highly resistant to further mechanical processing. The rational use of the new method for typical products and auxiliary operations has been studied, and calculation and design methods have been elaborated for the load-carrying elements of the equipment created. It has been shown that the magnetic pulse method, combined with a low-frequency vibration process, is promising at enterprises of general and special machine construction for cleaning lightweight blanks and containers used for transporting bulk goods.

  10. 40 CFR 471.81 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... range of 7.5 to 10.0 at all times. (j) Alkaline cleaning spent baths. Subpart H—BPT Pollutant or...-pounds) of zinc alkaline cleaned Chromium 0.002 0.0007 Copper 0.007 0.004 Cyanide 0.001 0.0004 Zinc 0.005... times. (k) Alkaline cleaning rinse. Subpart H—BPT Pollutant or pollutant property Maximum for any 1 day...

  11. 40 CFR 471.81 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... range of 7.5 to 10.0 at all times. (j) Alkaline cleaning spent baths. Subpart H—BPT Pollutant or...-pounds) of zinc alkaline cleaned Chromium 0.002 0.0007 Copper 0.007 0.004 Cyanide 0.001 0.0004 Zinc 0.005... times. (k) Alkaline cleaning rinse. Subpart H—BPT Pollutant or pollutant property Maximum for any 1 day...

  12. 40 CFR 471.81 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... range of 7.5 to 10.0 at all times. (j) Alkaline cleaning spent baths. Subpart H—BPT Pollutant or...-pounds) of zinc alkaline cleaned Chromium 0.002 0.0007 Copper 0.007 0.004 Cyanide 0.001 0.0004 Zinc 0.005... times. (k) Alkaline cleaning rinse. Subpart H—BPT Pollutant or pollutant property Maximum for any 1 day...

  13. 40 CFR 471.61 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Within the range of 7.5 to 10.0 at all times. (p) Alkaline cleaning spent baths. Subpart F—BPT Pollutant... off-pounds) of titanium alkaline cleaned Cyanide 0.070 0.029 Lead 0.101 0.048 Zinc 0.351 0.147 Ammonia....5 to 10.0 at all times. (q) Alkaline cleaning rinse. Subpart F—BPT Pollutant or pollutant property...

  14. Cleaning efficiency enhancement by ultrasounds for membranes used in dairy industries.

    PubMed

    Luján-Facundo, M J; Mendoza-Roca, J A; Cuartas-Uribe, B; Álvarez-Blanco, S

    2016-11-01

    Membrane cleaning is a key point for the implementation of membrane technologies in the dairy industry for protein concentration. In this study, four ultrafiltration (UF) membranes with different molecular weight cut-offs (MWCOs) (5, 15, 30 and 50 kDa) and materials (polyethersulfone and ceramic) were fouled with three different whey model solutions: bovine serum albumin (BSA), BSA plus CaCl2, and whey protein concentrate solution (Renylat 45). The purpose of the study was to evaluate the effect of ultrasound (US) on membrane cleaning efficiency. The influence of ultrasonic frequency and of the US application mode (submerging the membrane module inside the US bath or applying US to the cleaning solution) was also evaluated. The experiments were performed in a laboratory plant that included the US equipment and allowed the use of two membrane modules (flat sheet and tubular). The fouling solution that caused the highest fouling degree for all the membranes was Renylat 45. Results demonstrated that membrane cleaning with US was effective, and this effectiveness increased at lower frequencies. Although no significant differences were observed between the two US application modes tested, slightly higher cleaning efficiencies were achieved by placing the membrane module at the bottom of the tank. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. The applied technologies to access clean water for remote communities

    NASA Astrophysics Data System (ADS)

    Rabindra, I. B.

    2018-01-01

    Much research has been done on giving remote communities access to clean water, yet very little of it is utilized and implemented by the communities themselves, probably because the research results are judged impractical to apply. The aim of this paper is to seek a practical approach: how to establish design criteria that can be applied more easily, at the proper locations, with simple construction, while effectively producing the required volume and quality of clean water. The method used in this paper is an assessment of the clean water treatment/filtering technologies produced by a variety of previous studies, in order to establish a model of appropriate technology for remote communities. Various research results were collected from a literature study, while the opportunities and threats to their application were identified using a SWOT analysis. The discussion looks for alternative models of clean water filtration technology among the previous research results, to be selected as appropriate technology that is easily applied and brings many benefits to remote communities. The conclusions resulting from this discussion are expected to serve as basic design criteria for a model of clean water filtration technology that can be accepted and applied effectively by remote communities.

  16. Northwest Region Clean Energy Application Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sjoding, David

    2013-09-30

    The main objective of the Northwest Clean Energy Application Center (NW CEAC) is to promote and support implementation of clean energy technologies. These technologies include combined heat and power (CHP), district energy, waste heat recovery with a primary focus on waste heat to power, and other related clean energy systems such as stationary fuel cell CHP systems. The northwest states include AK, ID, MT, OR, and WA. The key aim/outcome of the Center is to promote and support implementation of clean energy projects. Implemented projects result in a number of benefits including increased energy efficiency, renewable energy development (when using opportunity fuels), reduced carbon emissions, improved facility economics helping to preserve jobs, and reduced criteria pollutants calculated on an output-based emissions basis. Specific objectives performed by the NW CEAC fall within the following five broad promotion and support categories: 1) Center management and planning, including database support; 2) Education and outreach, including plan development, website, target market workshops, and education/outreach materials development; 3) Identification and provision of screening assessments and feasibility studies as funded by the facility, or occasionally further support of potential high impact projects; 4) Project implementation assistance/troubleshooting; and 5) Development of a supportive clean energy policy and initiative/financing framework.

  17. Below room temperature: How the photocatalytic activity of dense and mesoporous TiO2 coatings is affected

    NASA Astrophysics Data System (ADS)

    Cedillo-González, Erika Iveth; Riccò, Raffaele; Costacurta, Stefano; Siligardi, Cristina; Falcaro, Paolo

    2018-03-01

    Different parameters such as morphology, porosity, crystalline phase or doping agents affect the self-cleaning performance of photocatalytic TiO2-based coatings. However, environmental conditions have also been found to play a major role in the photocatalytic self-cleaning property. Substrate temperature is a significant environmental variable that can drastically affect this process. This variable becomes especially important for outdoor applications: many self-cleaning photocatalytic materials are designed to be exposed to outdoor environments and, consequently, to variable temperatures depending on the season of the year and the typical weather of the geographical zone. Thus, understanding the influence of the most common outdoor temperatures on the self-cleaning performance of TiO2-based coatings is essential for the fabrication of any kind of photocatalytic self-cleaning material (fabricated by coating technology) that is expected to be subjected to outdoor environments. In this work, the photocatalytic activity of dense and mesoporous TiO2 coatings was studied by Fourier Transform Infrared (FTIR) spectroscopy while varying the temperature in the 0 to 30 °C range. The temperature conditions at which these coatings perform best were identified, providing a deeper insight for the practical application of TiO2-based self-cleaning coatings.

  18. Collection and analysis of NASA clean room air samples

    NASA Technical Reports Server (NTRS)

    Sheldon, L. S.; Keever, J.

    1985-01-01

    The environment of the HALOE assembly clean room at NASA Langley Research Center was analyzed to determine the background levels of airborne organic compounds. Sampling was accomplished by pumping the clean room air through absorbing cartridges. For volatile organics, cartridges were thermally desorbed and then analyzed by gas chromatography and mass spectrometry; compounds were identified by searching the EPA/NIH database using an interactive operator INCOS computer search algorithm. For semivolatile organics, cartridges were solvent extracted and the concentrated extracts were analyzed by gas chromatography with electron capture detection; compound identification was made by matching gas chromatogram retention times with known standards. The detection limits for the semivolatile organics were 0.89 ng/cu m for dioctyl phthalate (DOP) and 1.6 ng/cu m for polychlorinated biphenyls (PCB). The detection limit for volatile organics ranged from 1 to 50 parts per trillion. Only trace quantities of organics were detected; the DOP levels did not exceed 2.5 ng/cu m and the PCB levels did not exceed 454 ng/cu m.

  19. Photoacoustic tomography from weak and noisy signals by using a pulse decomposition algorithm in the time-domain.

    PubMed

    Liu, Liangbing; Tao, Chao; Liu, XiaoJun; Deng, Mingxi; Wang, Senhua; Liu, Jun

    2015-10-19

    Photoacoustic tomography is a promising and rapidly developing methodology of biomedical imaging. It confronts an increasingly urgent problem: reconstructing the image from weak and noisy photoacoustic signals, which would bring the great benefit of extending the imaging depth and decreasing the dose of laser exposure. Based on the time-domain characteristics of photoacoustic signals, a pulse decomposition algorithm is proposed to reconstruct a photoacoustic image from signals with a low signal-to-noise ratio. In this method, a photoacoustic signal is decomposed as the weighted summation of a set of pulses in the time domain. Images are reconstructed from the weight factors, which are directly related to the optical absorption coefficient. Both simulation and experiment were conducted to test the performance of the method. Numerical simulations show that when the signal-to-noise ratio is -4 dB, the proposed method decreases the reconstruction error to about 17%, in comparison with the conventional back-projection method. Moreover, it can produce acceptable images even when the signal-to-noise ratio is decreased to -10 dB. Experiments show that, when the laser fluence level is low, the proposed method achieves a relatively clean image of a hair phantom with some well preserved pattern details. The proposed method demonstrates the imaging potential of photoacoustic tomography in expanding applications.
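
    The "weighted summation of a set of pulses" idea described above can be sketched as a greedy matching-pursuit loop: repeatedly correlate a known pulse shape against the residual, peel off the best-matching component, and keep its shift and weight. This is an illustrative stand-in, not the authors' exact algorithm; the pulse shape and stopping parameters below are assumptions.

```python
def pulse_decompose(signal, pulse, n_components=10, tol=1e-6):
    """Greedily decompose `signal` into shifted, weighted copies of `pulse`.

    Returns a {shift: weight} dict and the final residual. Weights at a
    given shift play the role of the absorption-related weight factors."""
    residual = list(signal)
    c = len(pulse)
    energy = sum(p * p for p in pulse)
    weights = {}
    for _ in range(n_components):
        # Find the shift where the pulse best explains the residual.
        best_s, best_w = None, 0.0
        for s in range(len(signal) - c + 1):
            w = sum(pulse[j] * residual[s + j] for j in range(c)) / energy
            if abs(w) > abs(best_w):
                best_s, best_w = s, w
        if best_s is None or abs(best_w) < tol:
            break  # nothing significant left to explain
        weights[best_s] = weights.get(best_s, 0.0) + best_w
        for j in range(c):  # subtract the fitted component
            residual[best_s + j] -= best_w * pulse[j]
    return weights, residual
```

For well-separated components the loop recovers shifts and weights exactly; overlapping pulses would require a joint (least-squares) refit, which this toy omits.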

  20. Modeling, investigation and formulation of hydrophobic coatings for potential self-cleaning applications

    NASA Astrophysics Data System (ADS)

    Rios, Pablo Fabian

    Self-cleaning surfaces have received a great deal of attention, both in research and in commercial applications. Transparent and non-transparent self-cleaning surfaces are highly desired. The Lotus flower is a symbol of purity in Asian cultures: even when rising from muddy waters it stays clean and untouched by dirt. The Lotus leaf's "self-cleaning" surface is hydrophobic and rough, showing a two-layer morphology. While hydrophobicity produces a high contact angle, the surface morphology reduces the adhesion of dirt and water to the surface; thus water drops slide easily across the leaf, carrying the dirt particles with them. Nature's example in the Lotus effect, together with extensive scientific research in related fields, has established wide acceptance that high hydrophobicity can be obtained only by a proper combination of surface chemistry and roughness. Most researchers relate hydrophobicity to a high contact angle. However, the contact angle is not the only parameter that defines liquid-solid interactions. An additional parameter, the sliding angle, related to the adhesion between the liquid drop and the solid surface, is also important in cases where liquid sliding is involved, such as self-cleaning applications. In this work, it is postulated that wetting, which is related to the contact angle, and interfacial adhesion, which is related to the sliding angle, are interdependent phenomena and have to be considered simultaneously. A variety of models that relate the sliding angle to forces developed along the contact line between a liquid drop and a solid surface have been proposed in the literature. A new model is proposed here that quantifies the drop sliding phenomenon, based also on the interfacial adhesion across the contact area of the liquid/solid interface. The effects of roughness and chemical composition on the contact and sliding angles of hydrophobic smooth and rough surfaces were studied theoretically and experimentally. 
The validity of the proposed model was investigated and compared with the existing models. Ultra-hydrophobic non-transparent and transparent coatings for potential self-cleaning applications were produced using hydrophobic chemistry and different configurations of roughening micro- and nano-particles; however, these coatings presented low adhesion and durability. Enhancement of the durability and stability of such coatings was attempted, and improvements were achieved by different methods.

  1. LEAP: Looking beyond pixels with continuous-space EstimAtion of Point sources

    NASA Astrophysics Data System (ADS)

    Pan, Hanjie; Simeoni, Matthieu; Hurley, Paul; Blu, Thierry; Vetterli, Martin

    2017-12-01

    Context. Two main classes of imaging algorithms have emerged in radio interferometry: the CLEAN algorithm and its multiple variants, and compressed-sensing inspired methods. They are both discrete in nature, and estimate source locations and intensities on a regular grid. For the traditional CLEAN-based imaging pipeline, the resolution power of the tool is limited by the width of the synthesized beam, which is inversely proportional to the largest baseline. The finite rate of innovation (FRI) framework is a robust method to find the locations of point-sources in a continuum without grid imposition. The continuous formulation makes the FRI recovery performance only dependent on the number of measurements and the number of sources in the sky. FRI can theoretically find sources below the perceived tool resolution. To date, FRI had never been tested in the extreme conditions inherent to radio astronomy: weak signal / high noise, huge data sets, large numbers of sources. Aims: The aims were (i) to adapt FRI to radio astronomy, (ii) verify it can recover sources in radio astronomy conditions with more accurate positioning than CLEAN, and possibly resolve some sources that would otherwise be missed, (iii) show that sources can be found using less data than would otherwise be required to find them, and (iv) show that FRI does not lead to an augmented rate of false positives. Methods: We implemented a continuous domain sparse reconstruction algorithm in Python. The angular resolution performance of the new algorithm was assessed under simulation, and with visibility measurements from the LOFAR telescope. Existing catalogs were used to confirm the existence of sources. Results: We adapted the FRI framework to radio interferometry, and showed that it is possible to determine accurate off-grid point-source locations and their corresponding intensities. 
In addition, FRI-based sparse reconstruction required less integration time and smaller baselines to reach a comparable reconstruction quality compared to a conventional method. The achieved angular resolution is higher than the perceived instrument resolution, and very close sources can be reliably distinguished. The proposed approach has cubic complexity in the total number (typically around a few thousand) of uniform Fourier data of the sky image estimated from the reconstruction. It is also demonstrated that the method is robust to the presence of extended-sources, and that false-positives can be addressed by choosing an adequate model order to match the noise level.
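
    For context, the grid-based CLEAN baseline that LEAP is compared against can be sketched in a few lines as a 1-D Högbom-style loop: find the brightest pixel of the dirty image, subtract a scaled, shifted copy of the point-spread function, and record the removed flux as a model component. The gain, threshold and PSF below are illustrative assumptions, not values from the paper.

```python
def hogbom_clean(dirty, psf, gain=0.1, threshold=1e-3, max_iter=500):
    """Minimal 1-D Hogbom CLEAN: iteratively peel point-source flux
    off the dirty image until the residual drops below `threshold`."""
    dirty = list(dirty)              # work on a copy (the residual)
    model = [0.0] * len(dirty)
    center = len(psf) // 2           # PSF is assumed centered
    for _ in range(max_iter):
        peak = max(range(len(dirty)), key=lambda i: abs(dirty[i]))
        if abs(dirty[peak]) < threshold:
            break
        flux = gain * dirty[peak]    # loop gain keeps each step conservative
        model[peak] += flux
        for j, p in enumerate(psf):  # subtract the shifted, scaled PSF
            k = peak + j - center
            if 0 <= k < len(dirty):
                dirty[k] -= flux * p
    return model, dirty
```

The grid limitation the paper highlights is visible here: model components can only land on pixel centers, whereas the FRI formulation estimates source positions in the continuum.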

  2. QRS Detection Algorithm for Telehealth Electrocardiogram Recordings.

    PubMed

    Khamis, Heba; Weiss, Robert; Xie, Yang; Chang, Chan-Wei; Lovell, Nigel H; Redmond, Stephen J

    2016-07-01

    QRS detection algorithms are needed to analyze electrocardiogram (ECG) recordings generated in telehealth environments. However, the numerous published QRS detectors focus on clean clinical data. Here, a "UNSW" QRS detection algorithm is described that is suitable for clinical ECG and also poorer quality telehealth ECG. The UNSW algorithm generates a feature signal containing information about ECG amplitude and derivative, which is filtered according to its frequency content and an adaptive threshold is applied. The algorithm was tested on clinical and telehealth ECG and the QRS detection performance is compared to the Pan-Tompkins (PT) and Gutiérrez-Rivas (GR) algorithm. For the MIT-BIH Arrhythmia database (virtually artifact free, clinical ECG), the overall sensitivity (Se) and positive predictivity (+P) of the UNSW algorithm was >99%, which was comparable to PT and GR. When applied to the MIT-BIH noise stress test database (clinical ECG with added calibrated noise) after artifact masking, all three algorithms had overall Se >99%, and the UNSW algorithm had higher +P (98%, p < 0.05) than PT and GR. For 250 telehealth ECG records (unsupervised recordings; dry metal electrodes), the UNSW algorithm had 98% Se and 95% +P which was superior to PT (+P: p < 0.001) and GR (Se and +P: p < 0.001). This is the first study to describe a QRS detection algorithm for telehealth data and evaluate it on clinical and telehealth ECG with superior results to published algorithms. The UNSW algorithm could be used to manage increasing telehealth ECG analysis workloads.
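
    The adaptive-threshold stage described above can be sketched roughly as a peak picker with an exponentially decaying threshold and a refractory hold-off. This toy is not the published UNSW design; the decay rate, re-arm fraction and refractory period are assumptions, and the amplitude/derivative feature signal is taken as given.

```python
def detect_qrs(feature, fs, refractory=0.2, decay=0.999, frac=0.5):
    """Mark a beat when `feature` crosses a decaying threshold, then
    hold off for `refractory` seconds. `fs` is the sampling rate (Hz)."""
    threshold = max(feature[:int(fs)]) * frac   # seed from the first second
    hold = int(refractory * fs)
    last = -hold
    beats = []
    for n, x in enumerate(feature):
        threshold *= decay                      # slow decay tracks amplitude drift
        if x > threshold and n - last >= hold:
            beats.append(n)
            last = n
            threshold = frac * x                # re-arm at a fraction of the peak
    return beats
```

The decaying threshold is what lets such detectors survive the amplitude drift of dry-electrode telehealth recordings, at the cost of needing a refractory period to suppress double detections.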

  3. Consent Decree D.G. Yuengling and Son, Inc. Clean Water Act Settlement

    EPA Pesticide Factsheets

    Yuengling owns and operates two beer breweries (the New Brewery and the Old Brewery) in Pottsville, Pennsylvania, which have been in significant non-compliance with the Clean Water Act because of persistent violations of applicable industrial wastewater requirements.

  4. Precision Cleaning and Verification Processes Used at Marshall Space Flight Center for Critical Hardware Applications

    NASA Technical Reports Server (NTRS)

    Caruso, Salvadore V.; Cox, Jack A.; McGee, Kathleen A.

    1998-01-01

    Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration performs many research and development programs that require hardware and assemblies to be cleaned to levels that are compatible with fuels and oxidizers (liquid oxygen, solid propellants, etc.). Also, MSFC is responsible for developing large telescope satellites which require a variety of optical systems to be cleaned. A precision cleaning shop is operated within MSFC by the Fabrication Services Division of the Materials & Processes Laboratory. Verification of cleanliness is performed for all precision cleaned articles in the Environmental and Analytical Chemistry Branch. Since the Montreal Protocol was instituted, MSFC had to find substitutes for many materials that have been in use for many years, including cleaning agents and organic solvents. As MSFC is a research center, there is a great variety of hardware that is processed in the Precision Cleaning Shop. This entails the use of many different chemicals and solvents, depending on the nature and configuration of the hardware and softgoods being cleaned. A review of the manufacturing cleaning and verification processes, cleaning materials and solvents used at MSFC and changes that resulted from the Montreal Protocol will be presented.

  5. Precision Cleaning and Verification Processes Used at Marshall Space Flight Center for Critical Hardware Applications

    NASA Technical Reports Server (NTRS)

    Caruso, Salvadore V.

    1999-01-01

    Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) performs many research and development programs that require hardware and assemblies to be cleaned to levels that are compatible with fuels and oxidizers (liquid oxygen, solid propellants, etc.). Also, the Center is responsible for developing large telescope satellites, which require a variety of optical systems to be cleaned. A precision cleaning shop is operated within MSFC by the Fabrication Services Division of the Materials & Processes Division. Verification of cleanliness is performed for all precision cleaned articles in the Analytical Chemistry Branch. Since the Montreal Protocol was instituted, MSFC has had to find substitutes for many materials that have been in use for many years, including cleaning agents and organic solvents. As MSFC is a research center, there is a great variety of hardware that is processed in the Precision Cleaning Shop. This entails the use of many different chemicals and solvents, depending on the nature and configuration of the hardware and softgoods being cleaned. A review of the manufacturing cleaning and verification processes, cleaning materials and solvents used at MSFC, and changes that resulted from the Montreal Protocol will be presented.

  6. Facile Dry Surface Cleaning of Graphene by UV Treatment

    NASA Astrophysics Data System (ADS)

    Kim, Jin Hong; Haidari, Mohd Musaib; Choi, Jin Sik; Kim, Hakseong; Yu, Young-Jun; Park, Jonghyurk

    2018-05-01

    Graphene has been considered an ideal material for application in transparent lightweight wearable electronics due to its extraordinary mechanical, optical, and electrical properties originating from its ordered hexagonal carbon atomic lattice in a layer. Precise surface control is critical in maximizing its performance in electronic applications. Graphene grown by chemical vapor deposition is widely used, but it retains polymeric residue following the wet/chemical transfer process, which strongly affects its intrinsic electrical properties and limits the doping efficiency by adsorption. Here, we introduce a facile dry-cleaning method based on UV irradiation to eliminate the organic residues even after device fabrication. Through surface topography, Raman analysis, and electrical transport measurements, we confirm that the optimized UV treatment can recover the clean graphene surface and improve graphene-FET performance more effectively than thermal treatment. We propose our UV irradiation method as a systematically controllable and damage-free post process for application in large-area devices.

  7. Screen and clean: a tool for identifying interactions in genome-wide association studies.

    PubMed

    Wu, Jing; Devlin, Bernie; Ringquist, Steven; Trucco, Massimo; Roeder, Kathryn

    2010-04-01

    Epistasis could be an important source of risk for disease. How interacting loci might be discovered is an open question for genome-wide association studies (GWAS). Most researchers limit their statistical analyses to testing individual pairwise interactions (i.e., marginal tests for association). A more effective means of identifying important predictors is to fit models that include many predictors simultaneously (i.e., higher-dimensional models). We explore a procedure called screen and clean (SC) for identifying liability loci, including interactions, by using the lasso procedure, which is a model selection tool for high-dimensional regression. We approach the problem by using a varying dictionary consisting of terms to include in the model. In the first step the lasso dictionary includes only main effects. The most promising single-nucleotide polymorphisms (SNPs) are identified using a screening procedure. Next the lasso dictionary is adjusted to include these main effects and the corresponding interaction terms. Again, promising terms are identified using lasso screening. Then significant terms are identified through the cleaning process. Implementation of SC for GWAS requires algorithms to explore the complex model space induced by the many SNPs genotyped and their interactions. We propose and explore a set of algorithms and find that SC successfully controls Type I error while yielding good power to identify risk loci and their interactions. When the method is applied to data obtained from the Wellcome Trust Case Control Consortium study of Type 1 Diabetes it uncovers evidence supporting interaction within the HLA class II region as well as within Chromosome 12q24.
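
    The two-stage dictionary idea described above (screen main effects with the lasso, enlarge the dictionary with interactions among survivors, then clean) can be sketched with a tiny coordinate-descent lasso. This is an illustrative toy, not the authors' implementation; the penalty value and the keep threshold are assumptions, and the cleaning step here is a simple coefficient cutoff rather than a formal significance test.

```python
def soft(r, lam):
    """Soft-thresholding operator used by the lasso."""
    if r > lam:
        return r - lam
    if r < -lam:
        return r + lam
    return 0.0

def lasso_cd(X, y, lam, iters=50):
    """Tiny coordinate-descent lasso (stand-in for a real solver)."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(iters):
        for j in range(p):
            r_j = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                   for i in range(n)]
            rho = sum(X[i][j] * r_j[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft(rho, lam) / z
    return beta

def screen_and_clean(X, y, lam=0.1, keep=0.05):
    p = len(X[0])
    # 1) Screen: lasso on main effects only.
    mains = lasso_cd(X, y, lam)
    kept = [j for j in range(p) if abs(mains[j]) > keep]
    # 2) Enlarge the dictionary: kept mains plus their pairwise interactions.
    names = [f"x{j}" for j in kept]
    cols = [[row[j] for j in kept] for row in X]
    for a in range(len(kept)):
        for b in range(a + 1, len(kept)):
            names.append(f"x{kept[a]}*x{kept[b]}")
            for i, row in enumerate(X):
                cols[i].append(row[kept[a]] * row[kept[b]])
    # 3) Clean: lasso on the enlarged dictionary, keep surviving terms.
    beta = lasso_cd(cols, y, lam)
    return [nm for nm, b in zip(names, beta) if abs(b) > keep]
```

On a small orthogonal design with a true interaction, the procedure recovers both main effects and the interaction term while dropping the noise predictors.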

  8. [The application of operating room quality backward system in instrument place management].

    PubMed

    Du, Hui; He, Anjie; Zeng, Leilei

    2010-09-01

    To improve the cleaning quality of surgical instruments, optimize the preparation workflow, arrange work groups rationally, and raise working efficiency. We applied a quality backward (traceability) system to the analysis and optimization of problems in instrument cleaning, packing, and preparation, and appraised the effect after a six-month trial. After optimization, instrument cleaning quality improved markedly; flaws in instrument cleaning and packing and the total operating time decreased; conflicts between nurses and cleaning staff arising from unclear hand-offs decreased; and the satisfaction of nurses and doctors with the instruments increased. Applying an operating room quality backward system to the management of instrument cleaning, packing, and preparation can reduce flaws in the work and the waste of human resources, and raise working efficiency.

  9. 40 CFR 471.91 - Effluent limitations representing the degree of effluent reduction attainable by the application...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 107 TSS 364 173 pH (1) (1) 1 Within the range of 7.5 to 10.0 at all times. (j) Alkaline cleaning spent... mg/off-kg (pounds per million off-pounds) of zirconium-hafnium alkaline cleaned Chromium 0.704 0.288... 31.2 pH (1) (1) 1 Within the range of 7.5 to 10.0 at all times. (k) Alkaline cleaning rinse. Subpart...

  10. Robust vector quantization for noisy channels

    NASA Technical Reports Server (NTRS)

    Demarca, J. R. B.; Farvardin, N.; Jayant, N. S.; Shoham, Y.

    1988-01-01

    The paper briefly discusses techniques for making vector quantizers more tolerant to transmission errors. Two algorithms are presented for obtaining an efficient binary word assignment to the vector quantizer codewords without increasing the transmission rate. It is shown that about 4.5 dB gain over random assignment can be achieved with these algorithms. It is also proposed to reduce the effects of error propagation in vector-predictive quantizers by appropriately constraining the response of the predictive loop. The constrained system is shown to have about 4 dB of SNR gain over an unconstrained system in a noisy channel, with a small loss of clean-channel performance.
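
    The index-assignment idea can be illustrated with a pairwise-swap hill climb that minimizes the distortion caused by single bit errors. This is not one of the paper's two algorithms (which are not specified here); the cost model below assumes equiprobable codewords and a binary symmetric channel where at most one index bit flips.

```python
def channel_cost(codebook, assign, bits):
    """Total squared error over all single-bit flips of each codeword's
    binary index (equiprobable codewords, toy BSC model)."""
    inv = {a: i for i, a in enumerate(assign)}
    cost = 0.0
    for i, a in enumerate(assign):
        for b in range(bits):
            j = inv[a ^ (1 << b)]          # codeword decoded after bit b flips
            cost += (codebook[i] - codebook[j]) ** 2
    return cost

def improve_assignment(codebook, bits, start=None, sweeps=20):
    """Greedy pairwise-swap search for a bit-error-tolerant index map."""
    assign = list(start) if start is not None else list(range(len(codebook)))
    best = channel_cost(codebook, assign, bits)
    for _ in range(sweeps):
        improved = False
        for i in range(len(assign)):
            for j in range(i + 1, len(assign)):
                assign[i], assign[j] = assign[j], assign[i]
                c = channel_cost(codebook, assign, bits)
                if c < best:
                    best, improved = c, True
                else:                       # revert non-improving swaps
                    assign[i], assign[j] = assign[j], assign[i]
        if not improved:
            break
    return assign, best
```

The key point, matching the abstract, is that the search permutes only the binary words attached to codewords, so the transmission rate is unchanged.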

  11. 40 CFR 85.525 - Applicable standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) CONTROL OF AIR POLLUTION FROM MOBILE SOURCES Exemption of Clean Alternative Fuel Conversions From... prohibition, vehicles/engines that have been converted to operate on a different fuel must meet emission... allowable grouping. Fleet average standards do not apply unless clean alternative fuel conversions are...

  12. Memorandum of Agreement: Exemptions Under Section 404(F) of the Clean Water Act

    EPA Pesticide Factsheets

    Memorandum of Agreement between the Department of the Army and the Environmental Protection Agency Concerning the Determination of the Section 404 Program and the Application of the Exemptions Under Section 404(F) of the Clean Water Act

  13. Guidelines for Cleaning Transvaginal Ultrasound Transducers Between Patients.

    PubMed

    Abramowicz, Jacques S; Evans, David H; Fowlkes, J Brian; Maršal, Karel; terHaar, Gail

    2017-05-01

    The purpose of this article is to provide guidance regarding the cleaning and disinfection of transvaginal ultrasound probes. These recommendations are also applicable to transrectal probes. Copyright © 2017 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  14. The application of machine learning in multi sensor data fusion for activity recognition in mobile device space

    NASA Astrophysics Data System (ADS)

    Marhoubi, Asmaa H.; Saravi, Sara; Edirisinghe, Eran A.

    2015-05-01

    The present generation of mobile handheld devices comes equipped with a large number of sensors. The key sensors include the ambient light sensor, proximity sensor, gyroscope, compass and the accelerometer. Many mobile applications are driven by the readings obtained from one or two of these sensors. However, the presence of multiple sensors enables the determination of more detailed activities carried out by the user of a mobile device, thus enabling smarter mobile applications to be developed that respond more appropriately to user behavior and device usage. In the proposed research we use recent advances in machine learning to fuse together the data obtained from all key sensors of a mobile device. We investigate the possible use of single and ensemble classifier based approaches to identify a mobile device's behavior in the space in which it is present. Feature selection algorithms are used to remove non-discriminant features that often lead to poor classifier performance. As the sensor readings are noisy and include a significant proportion of missing values and outliers, we use machine learning based approaches to clean the raw data obtained from the sensors before use. Based on selected practical case studies, we demonstrate the ability to accurately recognize device behavior based on multi-sensor data fusion.
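
    The clean-then-fuse pipeline described above can be illustrated in miniature: impute missing sensor readings with the column mean, then combine several weak classifiers by majority vote. This is a toy sketch of the general approach only; the activity labels and the threshold classifiers are invented for illustration, not taken from the paper.

```python
def impute_mean(rows):
    """Replace None entries (missing sensor readings) with the column mean."""
    p = len(rows[0])
    means = []
    for j in range(p):
        vals = [r[j] for r in rows if r[j] is not None]
        means.append(sum(vals) / len(vals))
    return [[means[j] if r[j] is None else r[j] for j in range(p)]
            for r in rows]

def majority_vote(classifiers, x):
    """Ensemble fusion: each classifier votes, the most common label wins."""
    votes = [clf(x) for clf in classifiers]
    return max(set(votes), key=votes.count)
```

In a real pipeline the voters would be trained classifiers over selected features; the fusion rule itself stays this simple.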

  15. Application of ant colony Algorithm and particle swarm optimization in architectural design

    NASA Astrophysics Data System (ADS)

    Song, Ziyi; Wu, Yunfa; Song, Jianhua

    2018-02-01

    By studying the development of the ant colony algorithm and the particle swarm algorithm, this paper expounds the core ideas of the two algorithms, explores their combination with architectural design, and summarizes application rules for intelligent algorithms in architectural design. Combining the characteristics of the two algorithms, it derives a research route and an implementation path for intelligent algorithms in architectural design, and establishes algorithm rules to assist the design process. Taking intelligent algorithms as a starting point for architectural design research, the authors provide a theoretical foundation for the ant colony algorithm and the particle swarm algorithm in architectural design, broaden the range of applications of intelligent algorithms in the field, and provide a new idea for architects.
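
    As a concrete reference for the second algorithm mentioned above, a minimal particle swarm optimizer in its generic textbook form (inertia plus pulls toward personal and global bests) is sketched below; it minimizes an arbitrary 1-D cost function, which in a design setting might encode a geometric penalty. The coefficients are conventional defaults, not values from the paper.

```python
import random

def pso(f, lo, hi, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Textbook particle swarm optimization on the interval [lo, hi]."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = list(xs)                 # each particle's best position so far
    gbest = min(xs, key=f)           # swarm-wide best position so far
    for _ in range(iters):
        for i in range(n_particles):
            # inertia + pull toward personal best + pull toward global best
            vs[i] = (w * vs[i]
                     + c1 * rng.random() * (pbest[i] - xs[i])
                     + c2 * rng.random() * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
            if f(xs[i]) < f(gbest):
                gbest = xs[i]
    return gbest
```

The ant colony algorithm follows the same black-box pattern (candidate solutions scored by a cost function), but with pheromone-weighted construction instead of velocity updates.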

  16. Stability of reference masses: VII. Cleaning methods in air and vacuum applied to a platinum mass standard similar to the international and national kilogram prototypes

    NASA Astrophysics Data System (ADS)

    Cumpson, Peter J.; Sano, Naoko; Barlow, Anders J.; Portoles, Jose F.

    2013-10-01

    Mercury contamination and the build-up of carbonaceous contamination are two contributing factors to the instability observed in kilogram prototype masses. The kilogram prototypes that lie at the core of the dissemination of the SI base unit were manufactured in the late 19th century, and have polished surfaces. In papers IV and V of this series we developed a method for cleaning noble metal mass standards in air to remove carbonaceous contamination. At the core of this ‘UVOPS’ protocol is the application of UV light and ozone gas generated in situ in air. The precise nature of the carbonaceous contamination that builds up on such surfaces is difficult to mimic demonstrably or quickly on new test surfaces, yet data from such tests are needed to provide the final confidence to allow UVOPS to be applied to a real 19th century kilogram prototype. Therefore, in the present work we have applied the UVOPS method to clean a platinum avoirdupois pound mass standard, ‘RS2’, manufactured in the mid-19th century. This is thought to have been polished in a similar manner to the kilogram prototypes. To our knowledge this platinum surface has not previously been cleaned by any method. We used x-ray photoelectron spectroscopy to identify organic contamination, and weighing to quantify the mass lost at each application of the UVOPS procedure. The UVOPS procedure is shown to be very effective. It is likely that the redefinition of the kilogram will require mass comparisons in vacuum in the years to come. Therefore, in addition to UVOPS a cleaning method for use in vacuum will also be needed. We introduce and evaluate gas cluster ion-beam (GCIB) treatment as a potential method for cleaning reference masses in vacuum. Again, application of this GCIB cleaning to a real artefact, RS2, allows us to make a realistic evaluation of its performance. While it has some attractive features, we cannot recommend it for cleaning mass standards in its present form.

  17. Tooth Surface Comparison after Air Polishing and Rubber Cup: A Scanning Electron Microscopy Study.

    PubMed

    Camboni, Sara; Donnet, Marcel

    2016-03-01

    To demonstrate, using microscopic observations, the difference between two well-known oral prophylaxis techniques: polishing paste and air polishing. The observations were performed on human enamel. Enamel samples were obtained from plaque-rich human teeth extracted for orthodontic or clinical purposes. In order to allow a reliable comparison between different applications, each enamel sample was divided into two parts: one underwent air-polishing, whereas polishing paste was applied to the other. AIR-FLOW® Master was selected together with AIR-FLOW® PLUS for the prophylaxis powder application. For the polishing-paste application, several different pastes were used, including Cleanic®, CCS®, Proxyt®, and SuperPolish. As a comparative control, enamel was also cleaned with sodium hypochlorite (6%). The enamel treated with AIR-FLOW PLUS showed a similar surface when compared to the control enamel; however, there was complete cleaning down to the tooth microstructure. On the other hand, use of the polishing paste resulted in an enamel surface that appeared abraded and flattened. Moreover, some of the natural irregular enamel surfaces demonstrated some filling in with debris. AIR-FLOW PLUS powder was able to clean more deeply without damaging the enamel, making it suitable for regular cleaning treatments. The polishing pastes were found to abrade and flatten the enamel surface and to deposit debris into the microcavities. Because the two methods have different mechanical effects, they can be considered complementary: some patients experience a sense of "roughness" following a cleaning, and in such cases a clinical recommendation would be to use air polishing first to clean the enamel surface, followed by a little polishing paste to smooth the surface, if required.

  18. The Diagnosis of Urinary Tract infection in Young children (DUTY): a diagnostic prospective observational study to derive and validate a clinical algorithm for the diagnosis of urinary tract infection in children presenting to primary care with an acute illness.

    PubMed Central

    Hay, Alastair D; Birnie, Kate; Busby, John; Delaney, Brendan; Downing, Harriet; Dudley, Jan; Durbaba, Stevo; Fletcher, Margaret; Harman, Kim; Hollingworth, William; Hood, Kerenza; Howe, Robin; Lawton, Michael; Lisles, Catherine; Little, Paul; MacGowan, Alasdair; O'Brien, Kathryn; Pickles, Timothy; Rumsby, Kate; Sterne, Jonathan Ac; Thomas-Jones, Emma; van der Voort, Judith; Waldron, Cherry-Ann; Whiting, Penny; Wootton, Mandy; Butler, Christopher C

    2016-01-01

    BACKGROUND It is not clear which young children presenting acutely unwell to primary care should be investigated for urinary tract infection (UTI) and whether or not dipstick testing should be used to inform antibiotic treatment. OBJECTIVES To develop algorithms to accurately identify pre-school children in whom urine should be obtained; assess whether or not dipstick urinalysis provides additional diagnostic information; and model algorithm cost-effectiveness. DESIGN Multicentre, prospective diagnostic cohort study. SETTING AND PARTICIPANTS Children < 5 years old presenting to primary care with an acute illness and/or new urinary symptoms. METHODS One hundred and seven clinical characteristics (index tests) were recorded from the child's past medical history, symptoms, physical examination signs and urine dipstick test. Prior to dipstick results, clinician opinion of UTI likelihood ('clinical diagnosis') and urine sampling and treatment intentions ('clinical judgement') were recorded. All index tests were measured blind to the reference standard, defined as a pure or predominant uropathogen cultured at ≥ 10⁵ colony-forming units (CFU)/ml in a single research laboratory. Urine was collected by clean catch (preferred) or nappy pad. Index tests were sequentially evaluated in two groups, stratified by urine collection method: parent-reported symptoms with clinician-reported signs, and urine dipstick results. Diagnostic accuracy was quantified using the area under the receiver operating characteristic curve (AUROC) with 95% confidence interval (CI) and bootstrap-validated AUROC, and compared with the 'clinical diagnosis' AUROC. Decision-analytic models were used to identify the optimal urine sampling strategy compared with 'clinical judgement'. RESULTS A total of 7163 children were recruited, of whom 50% were female and 49% were < 2 years old. 
Culture results were available for 5017 (70%); 2740 children provided clean-catch samples, 94% of whom were ≥ 2 years old, with 2.2% meeting the UTI definition. Among these, 'clinical diagnosis' correctly identified 46.6% of positive cultures, with 94.7% specificity and an AUROC of 0.77 (95% CI 0.71 to 0.83). Four symptoms, three signs and three dipstick results were independently associated with UTI, with an AUROC (95% CI; bootstrap-validated AUROC) of 0.89 (0.85 to 0.95; validated 0.88) for symptoms and signs, increasing to 0.93 (0.90 to 0.97; validated 0.90) with dipstick results. Nappy pad samples were provided by the other 2277 children, of whom 82% were < 2 years old and 1.3% met the UTI definition. 'Clinical diagnosis' correctly identified 13.3% of positive cultures, with 98.5% specificity and an AUROC of 0.63 (95% CI 0.53 to 0.72). Four symptoms and two dipstick results were independently associated with UTI, with an AUROC of 0.81 (0.72 to 0.90; validated 0.78) for symptoms, increasing to 0.87 (0.80 to 0.94; validated 0.82) with the dipstick findings. A high-specificity threshold for the clean-catch model was more accurate and less costly than, and as effective as, clinical judgement. The additional diagnostic utility of dipstick testing was offset by its costs. The cost-effectiveness of the nappy pad model was not clear-cut. CONCLUSIONS Clinicians should prioritise the use of clean-catch sampling, as symptoms and signs can cost-effectively improve the identification of UTI in young children where clean catch is possible. Dipstick testing can improve targeting of antibiotic treatment, but at a higher cost than waiting for a laboratory result. Future research is needed to distinguish pathogens from contaminants, assess the impact of the clean-catch algorithm on patient outcomes, and evaluate the cost-effectiveness of presumptive versus dipstick versus laboratory-guided antibiotic treatment. 
FUNDING The National Institute for Health Research Health Technology Assessment programme. PMID:27401902
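
    The DUTY analysis above quantifies diagnostic accuracy with the area under the receiver operating characteristic curve. As an illustrative aside (not the study's own code), the AUROC can be computed directly from labels and risk scores via the Mann-Whitney rank identity:

```python
def auroc(labels, scores):
    """AUROC via the Mann-Whitney identity: the probability that a
    randomly chosen positive case scores higher than a randomly
    chosen negative case (ties count one half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: two positives, two negatives
print(auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

    The bootstrap-validated AUROCs reported above simply repeat this computation on resampled datasets to correct for optimism.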

  19. The Diagnosis of Urinary Tract infection in Young children (DUTY): a diagnostic prospective observational study to derive and validate a clinical algorithm for the diagnosis of urinary tract infection in children presenting to primary care with an acute illness.

    PubMed

    Hay, Alastair D; Birnie, Kate; Busby, John; Delaney, Brendan; Downing, Harriet; Dudley, Jan; Durbaba, Stevo; Fletcher, Margaret; Harman, Kim; Hollingworth, William; Hood, Kerenza; Howe, Robin; Lawton, Michael; Lisles, Catherine; Little, Paul; MacGowan, Alasdair; O'Brien, Kathryn; Pickles, Timothy; Rumsby, Kate; Sterne, Jonathan Ac; Thomas-Jones, Emma; van der Voort, Judith; Waldron, Cherry-Ann; Whiting, Penny; Wootton, Mandy; Butler, Christopher C

    2016-07-01

    It is not clear which young children presenting acutely unwell to primary care should be investigated for urinary tract infection (UTI) and whether or not dipstick testing should be used to inform antibiotic treatment. To develop algorithms to accurately identify pre-school children in whom urine should be obtained; assess whether or not dipstick urinalysis provides additional diagnostic information; and model algorithm cost-effectiveness. Multicentre, prospective diagnostic cohort study. Children < 5 years old presenting to primary care with an acute illness and/or new urinary symptoms. One hundred and seven clinical characteristics (index tests) were recorded from the child's past medical history, symptoms, physical examination signs and urine dipstick test. Prior to dipstick results, clinician opinion of UTI likelihood ('clinical diagnosis') and urine sampling and treatment intentions ('clinical judgement') were recorded. All index tests were measured blind to the reference standard, defined as a pure or predominant uropathogen cultured at ≥ 10⁵ colony-forming units (CFU)/ml in a single research laboratory. Urine was collected by clean catch (preferred) or nappy pad. Index tests were sequentially evaluated in two groups, stratified by urine collection method: parent-reported symptoms with clinician-reported signs, and urine dipstick results. Diagnostic accuracy was quantified using the area under the receiver operating characteristic curve (AUROC) with 95% confidence interval (CI) and bootstrap-validated AUROC, and compared with the 'clinician diagnosis' AUROC. Decision-analytic models were used to identify the optimal urine sampling strategy compared with 'clinical judgement'. A total of 7163 children were recruited, of whom 50% were female and 49% were < 2 years old. Culture results were available for 5017 (70%); 2740 children provided clean-catch samples, 94% of whom were ≥ 2 years old, with 2.2% meeting the UTI definition. 
Among these, 'clinical diagnosis' correctly identified 46.6% of positive cultures, with 94.7% specificity and an AUROC of 0.77 (95% CI 0.71 to 0.83). Four symptoms, three signs and three dipstick results were independently associated with UTI, with an AUROC (95% CI; bootstrap-validated AUROC) of 0.89 (0.85 to 0.95; validated 0.88) for symptoms and signs, increasing to 0.93 (0.90 to 0.97; validated 0.90) with dipstick results. Nappy pad samples were provided by the other 2277 children, of whom 82% were < 2 years old and 1.3% met the UTI definition. 'Clinical diagnosis' correctly identified 13.3% of positive cultures, with 98.5% specificity and an AUROC of 0.63 (95% CI 0.53 to 0.72). Four symptoms and two dipstick results were independently associated with UTI, with an AUROC of 0.81 (0.72 to 0.90; validated 0.78) for symptoms, increasing to 0.87 (0.80 to 0.94; validated 0.82) with the dipstick findings. A high-specificity threshold for the clean-catch model was more accurate and less costly than, and as effective as, clinical judgement. The additional diagnostic utility of dipstick testing was offset by its costs. The cost-effectiveness of the nappy pad model was not clear-cut. Clinicians should prioritise the use of clean-catch sampling, as symptoms and signs can cost-effectively improve the identification of UTI in young children where clean catch is possible. Dipstick testing can improve targeting of antibiotic treatment, but at a higher cost than waiting for a laboratory result. Future research is needed to distinguish pathogens from contaminants, assess the impact of the clean-catch algorithm on patient outcomes, and evaluate the cost-effectiveness of presumptive versus dipstick versus laboratory-guided antibiotic treatment. This study was funded by the National Institute for Health Research Health Technology Assessment programme.

  20. Algorithm improvement program nuclide identification algorithm scoring criteria and scoring application.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enghauser, Michael

    2016-02-01

    The goal of the Domestic Nuclear Detection Office (DNDO) Algorithm Improvement Program (AIP) is to facilitate gamma-radiation detector nuclide identification algorithm development, improvement, and validation. Accordingly, scoring criteria have been developed to objectively assess the performance of nuclide identification algorithms. In addition, a Microsoft Excel spreadsheet application for automated nuclide identification scoring has been developed. This report provides an overview of the equations, nuclide weighting factors, nuclide equivalencies, and configuration weighting factors used by the application for scoring nuclide identification algorithm performance. Furthermore, this report presents a general overview of the nuclide identification algorithm scoring application including illustrative examples.
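
    The report's actual scoring equations are not reproduced in this abstract. Purely as a hypothetical sketch of how weighted identification scoring of this general kind can work, a score can combine per-nuclide weights with a configuration weighting factor; every name and weight below is invented for illustration:

```python
def weighted_id_score(results, nuclide_weights, config_weight=1.0):
    """Hypothetical weighted score: the weight-adjusted fraction of
    nuclides identified correctly, scaled by a configuration
    weighting factor. Not the DNDO AIP equations."""
    total = sum(nuclide_weights[n] for n in results)
    correct = sum(nuclide_weights[n] for n, ok in results.items() if ok)
    return config_weight * correct / total

# Invented example: the algorithm found Cs-137 but missed Co-60,
# and Cs-137 carries twice the weight
score = weighted_id_score({"Cs137": True, "Co60": False},
                          {"Cs137": 2.0, "Co60": 1.0})
print(round(score, 3))  # 0.667
```

    The point of such weighting is that misidentifying a high-priority nuclide costs more than misidentifying a benign one; the report defines the real factors and equivalencies.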

  1. 40 CFR 442.40 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TRANSPORTATION EQUIPMENT CLEANING POINT SOURCE CATEGORY Tanks Transporting Food Grade Cargos § 442.40... containers, rail tank cars, tank barges and ocean/sea tankers which have been used to transport food grade cargos. If wastewater generated from cleaning tanks used to transport food grade cargos is mixed with...

  2. Test results of flight guidance for fuel conservative descents in a time-based metered air traffic environment. [terminal configured vehicle

    NASA Technical Reports Server (NTRS)

    Knox, C. E.; Person, L. H., Jr.

    1981-01-01

    NASA developed, implemented, and flight-tested a flight management algorithm designed to improve the accuracy of delivering an airplane in a fuel-conservative manner to a metering fix at a time designated by air traffic control. This algorithm provides a 3D path with time control (4D) for the TCV B-737 airplane to make an idle-thrust, clean-configured (landing gear up, flaps zero, and speed brakes retracted) descent to arrive at the metering fix at a predetermined time, altitude, and airspeed. The descent path is calculated for a constant Mach/airspeed schedule from linear approximations of airplane performance, with considerations given for gross weight, wind, and nonstandard pressure and temperature effects. The flight management descent algorithm is described and flight test results are presented.
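
    The NASA algorithm builds its path from linear performance approximations with wind and temperature corrections. As a much simpler back-of-the-envelope sketch (not the TCV algorithm), the time and along-track distance of a constant-rate, constant-ground-speed descent to a metering fix follow directly from kinematics; the numbers below are illustrative, not from the flight tests:

```python
def descent_profile(cruise_alt_ft, fix_alt_ft, ground_speed_kt, descent_rate_fpm):
    """Time (minutes) and along-track distance (nautical miles) to
    descend from cruise altitude to the metering-fix altitude at a
    constant descent rate and constant ground speed."""
    time_min = (cruise_alt_ft - fix_alt_ft) / descent_rate_fpm
    dist_nm = ground_speed_kt * time_min / 60.0  # knots = NM per hour
    return time_min, dist_nm

t, d = descent_profile(35000, 11000, 300, 2000)
print(t, d)  # 12.0 minutes, 60.0 NM
```

    A real 4D descent planner works backwards from the fix along such a geometry, then adjusts the top-of-descent point so the idle-thrust path meets the assigned crossing time.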

  3. High level waste tank closure project: ALARA applications at the Idaho National Engineering and Environmental Laboratory.

    PubMed

    Aitken, Steven B; Butler, Richard; Butterworth, Steven W; Quigley, Keith D

    2005-05-01

    Bechtel BWXT Idaho, Maintenance and Operating Contractor for the Department of Energy at the Idaho National Engineering and Environmental Laboratory, has emptied, cleaned, and sampled six of the eleven 1.135 × 10⁶ L high level waste underground storage tanks at the Idaho Nuclear Technology and Engineering Center, well ahead of the State of Idaho Consent Order cleaning schedule. Cleaning of a seventh tank is expected to be complete by the end of calendar year 2004. The tanks, with associated vaults, valve boxes, and distribution systems, are being closed to meet Resource Conservation and Recovery Act regulations and Department of Energy orders. The use of remotely operated equipment placed in the tanks through existing tank riser access points, sampling methods and application of as-low-as-reasonably-achievable (ALARA) principles have proven effective in keeping personnel dose low during equipment removal, tank, vault, and valve box cleaning, and sampling activities, currently at 0.03 Sv.

  4. Removal of graffiti paintings from the Mansion de Mattis site in Corato (Bari), Italy: Laser deveiling or complete cleaning?

    NASA Astrophysics Data System (ADS)

    Daurelio, G.; Andriani, E. S.; Albanese, A.; Catalano, I. M.; Teseo, G.; Marano, D.

    2008-10-01

    Nowadays, one of the main problems in the conservation of stone monuments is not only natural environmental deterioration but also defacement, particularly esthetic, due to graffiti. This paper presents the different stages of the graffiti-cleaning research: the laboratory study phase, whose aims were to investigate the effect of laser cleaning on the substrate and to test user-friendly and efficient solutions for in situ application, and the application phase, in which the study results were applied in the restoration of the Palazzo de Mattis facade. The graffiti cleaning was carried out using a Q-switched Nd:YAG laser source (λ = 1064 nm, pulse duration t = 8 ns, f = 2 to 20 Hz, energy per pulse up to 280 mJ) in dry, wet and very wet modes, adopting the Daurelio technique n.1 (blade spot laser). The Q-switched Nd:YAG laser source proved to be the most suitable for a full or, according to the new restoration theory, "deveiling" graffiti ablation.

  5. Three-dimensional biocompatible matrix for reconstructive surgery

    NASA Astrophysics Data System (ADS)

    Reshetov, I. V.; Starceva, O. I.; Istranov, A. L.; Vorona, B. N.; Lyundup, A. V.; Gulyaev, I. V.; Melnikov, D. V.; Shtansky, D. V.; Sheveyko, A. N.; Andreev, V. A.

    2016-08-01

    A study into the development of an original bioengineered structure for the reconstruction of hollow organs is presented. The basis of the structure is a mesh matrix made from titanium nickelide (NiTi), which has sufficient elasticity and shape memory for the reconstruction of hollow tubular organs. In order to increase cell adhesion on the surface of the matrix, the grid needed to be cleaned of impurities, for which we used an ionic cleaning method. A further advantage is the possibility of applying a bioactive component to the grid surface. These features may improve the biocompatibility of the composite material. In the first stage, a mesh structure was made from NiTi fibers and the properties of the resulting mesh matrix were studied. In the second stage, the degrees of adhesion and cell growth rates were compared for the untreated matrix, the matrix after ionic cleaning, and the matrix after ionic cleaning followed by application of the bioactive component. The results showed significantly better biocompatibility of the titanium nickelide matrix after its ionic cleaning. Ionic cleaning removes toxic contaminants, which are a consequence of the technological production process of the material, and provides optimal adhesion properties for the fiber surface. The NiTi net matrix with a TiCaPCON coating may be the optimal basis for fabricating hollow elastic organs.

  6. Cleaning Robot for Solar Panels in Solar Power Station

    NASA Astrophysics Data System (ADS)

    Hang, Lu-Bin; Shen, Cheng-Wei; Bian, Huai-Qiang; Wang, Yan

    2016-05-01

    Dust particles on solar panel surfaces have become a serious problem for the photovoltaic industry, and a new monorail-tracked robot for the automatic cleaning of solar panels is presented in this paper. To meet the requirement of comprehensive and stable cleaning of a PV array, the monorail-tracked design of the robot is introduced, based on the monorail structure technique. Running and striding mechanisms are designed to give the robot mobility on the solar panels. In line with the carrying capacity and the water circulation mechanism, a self-cleaning device with a filtering system was developed. Combined with computer software and communications technology, a control system is built into the robot that realizes autonomous operation, positioning and monitoring. The application of this cleaning robot can enable industrial-scale automatic cleaning of PV components and has wide market prospects.

  7. An efficient robust sound classification algorithm for hearing aids.

    PubMed

    Nordqvist, Peter; Leijon, Arne

    2004-06-01

    An efficient robust sound classification algorithm based on hidden Markov models is presented. The system would enable a hearing aid to automatically change its behavior for differing listening environments according to the user's preferences. This work attempts to distinguish between three listening environment categories: speech in traffic noise, speech in babble, and clean speech, regardless of the signal-to-noise ratio. The classifier uses only the modulation characteristics of the signal, ignoring the absolute sound pressure level and the absolute spectrum shape, which results in an algorithm that is robust against irrelevant acoustic variations. The measured classification hit rate was 96.7%-99.5% when the classifier was tested with sounds representing one of the three environment categories included in the classifier. False-alarm rates were 0.2%-1.7% in these tests. The algorithm is robust and efficient, requiring few instructions and little memory, and it is fully possible to implement the classifier in a DSP-based hearing instrument.
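
    A classifier of this kind scores the observed feature sequence against one hidden Markov model per listening environment and picks the best-scoring model. The hearing-aid models themselves are not published in this abstract; as a generic sketch of the core computation, the scaled forward algorithm gives the log-likelihood of a discrete observation sequence under an HMM:

```python
import math

def forward_log_likelihood(pi, A, B, obs):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete HMM.
    pi: initial state probabilities, A: state transition matrix,
    B: emission matrix (states x symbols), obs: list of symbol indices."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    log_lik = 0.0
    for t in range(1, len(obs) + 1):
        c = sum(alpha)            # scaling factor avoids underflow
        log_lik += math.log(c)
        alpha = [a / c for a in alpha]
        if t < len(obs):          # propagate one step and emit
            o = obs[t]
            alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                     for j in range(n)]
    return log_lik

# Degenerate single-state check: a fair coin emitting 3 symbols
# has likelihood 0.5**3 = 0.125
print(round(forward_log_likelihood([1.0], [[1.0]], [[0.5, 0.5]], [0, 1, 0]), 4))  # -2.0794
```

    In a classifier, this function would be evaluated once per environment model (traffic, babble, clean speech) on the same modulation-feature sequence, with argmax selecting the environment.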

  8. Text extraction via an edge-bounded averaging and a parametric character model

    NASA Astrophysics Data System (ADS)

    Fan, Jian

    2003-01-01

    We present a deterministic text extraction algorithm that relies on three basic assumptions: color/luminance uniformity of the interior region, closed boundaries of sharp edges and the consistency of local contrast. The algorithm is basically independent of the character alphabet, text layout, font size and orientation. The heart of this algorithm is an edge-bounded averaging for the classification of smooth regions that enhances robustness against noise without sacrificing boundary accuracy. We have also developed a verification process to clean up the residue of incoherent segmentation. Our framework provides a symmetric treatment for both regular and inverse text. We have proposed three heuristics for identifying the type of text from a cluster consisting of two types of pixel aggregates. Finally, we have demonstrated the advantages of the proposed algorithm over adaptive thresholding and block-based clustering methods in terms of boundary accuracy, segmentation coherency, and capability to identify inverse text and separate characters from background patches.

  9. A Note on Evolutionary Algorithms and Its Applications

    ERIC Educational Resources Information Center

    Bhargava, Shifali

    2013-01-01

    This paper introduces evolutionary algorithms with its applications in multi-objective optimization. Here elitist and non-elitist multiobjective evolutionary algorithms are discussed with their advantages and disadvantages. We also discuss constrained multiobjective evolutionary algorithms and their applications in various areas.

  10. Manual cleaning of hospital mattresses: an observational study comparing high- and low-resource settings.

    PubMed

    Hopman, J; Hakizimana, B; Meintjes, W A J; Nillessen, M; de Both, E; Voss, A; Mehtar, S

    2016-01-01

    Hospital-associated infections (HAIs) are more frequently encountered in low- than in high-resource settings. There is a need to identify and implement feasible and sustainable approaches to strengthen HAI prevention in low-resource settings. To evaluate the biological contamination of routinely cleaned mattresses in both high- and low-resource settings. In this two-stage observational study, routine manual bed cleaning was evaluated at two university hospitals using adenosine triphosphate (ATP). Standardized training of cleaning personnel was achieved in both high- and low-resource settings. Qualitative analysis of the cleaning process was performed to identify predictors of cleaning outcome in low-resource settings. Mattresses in low-resource settings were highly contaminated prior to cleaning. Cleaning significantly reduced biological contamination of mattresses in low-resource settings (P < 0.0001). After training, the contamination observed after cleaning in both the high- and low-resource settings seemed comparable. Cleaning with the appropriate type of cleaning materials reduced the contamination of mattresses adequately. Predictors for mattresses that remained contaminated in a low-resource setting included: type of product used, type of ward, training, and the level of contamination prior to cleaning. In low-resource settings mattresses were highly contaminated as noted by ATP levels. Routine manual cleaning by trained staff can be as effective in a low-resource setting as in a high-resource setting. We recommend a multi-modal cleaning strategy that consists of training of domestic services staff, availability of adequate time to clean beds between patients, and application of the correct type of cleaning products. Copyright © 2015 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  11. Efficacy of a hospital-wide environmental cleaning protocol on hospital-acquired methicillin-resistant Staphylococcus aureus rates.

    PubMed

    Watson, Paul Andrew; Watson, Luke Robert; Torress-Cook, Alfonso

    2016-07-01

    Environmental contamination has been associated with over half of methicillin-resistant Staphylococcus aureus (MRSA) outbreaks in hospitals. We explored whether a hospital-wide environmental and patient cleaning protocol would lower hospital-acquired MRSA rates and associated costs. This study evaluates the impact of implementing a hospital-wide environmental and patient cleaning protocol on the rate of MRSA infection and the potential cost benefit of the intervention. A retrospective, pre-post interventional study design was used. The intervention comprised a combination of enhanced environmental cleaning of high-touch surfaces, daily washing of patients with benzalkonium chloride, and targeted isolation of patients with active infection. The rate of MRSA infection per 1000 patient days (PD) before the intervention (Steiros Algorithm®) was compared with the rate after it was implemented. A cost-benefit analysis based on the number of MRSA infections avoided was conducted. MRSA rates decreased by 96%, from 3.04 per 1000 PD to 0.11 per 1000 PD (P < 0.0001). This reduction in MRSA infections avoided an estimated $1,655,143 in healthcare costs. Implementation of this hospital-wide protocol appears to be associated with a reduction in the rate of MRSA infection and therefore a reduction in associated healthcare costs.
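
    The 96% figure reported above follows directly from the two rates given in the abstract; a one-line check:

```python
def relative_reduction(rate_before, rate_after):
    """Percentage reduction between a pre- and post-intervention rate."""
    return 100.0 * (rate_before - rate_after) / rate_before

# MRSA infections per 1000 patient days, from the abstract
print(round(relative_reduction(3.04, 0.11)))  # 96
```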

  12. 40 CFR 91.603 - Applicability of part 91, subpart F.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... violations of the Clean Air Act and the regulations thereunder. (Authorized Company Representative.) (9.... 91.603 Section 91.603 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... representative of the manufacturer: This report is submitted pursuant to Sections 213 and 208 of the Clean Air...

  13. 40 CFR 91.603 - Applicability of part 91, subpart F.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... violations of the Clean Air Act and the regulations thereunder. (Authorized Company Representative.) (9.... 91.603 Section 91.603 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... representative of the manufacturer: This report is submitted pursuant to Sections 213 and 208 of the Clean Air...

  14. 40 CFR 91.603 - Applicability of part 91, subpart F.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... violations of the Clean Air Act and the regulations thereunder. (Authorized Company Representative.) (9.... 91.603 Section 91.603 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... representative of the manufacturer: This report is submitted pursuant to Sections 213 and 208 of the Clean Air...

  15. 40 CFR 91.603 - Applicability of part 91, subpart F.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... violations of the Clean Air Act and the regulations thereunder. (Authorized Company Representative.) (9....603 Section 91.603 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... representative of the manufacturer: This report is submitted pursuant to Sections 213 and 208 of the Clean Air...

  16. CLEANING EXCAVATED SOIL USING EXTRACTION AGENTS: A STATE-OF-THE-ART REVIEW

    EPA Science Inventory

    This report presents a state-of-the-art review of soil washing technologies and their applicability to Superfund sites in the United States. The review includes Superfund site soil and contamination characteristics; as well as soil cleaning technologies, their principles of opera...

  17. THE USE OF GEOMORPHOLOGY IN THE ASSESSMENT OF STREAM STABILITY

    EPA Science Inventory

    Various applications of geomorphic data and stream stability rating systems are being considered in order to establish tools for the development of TMDLs for clean sediment in streams. The transport of "clean" sediment, as opposed to contaminated sediment, is of concern to the en...

  18. Application of CO2 Snow Jet Cleaning in Conjunction with Laboratory Based Total Reflection X-Ray Fluorescence

    NASA Technical Reports Server (NTRS)

    Schmeling, M.; Burnett, D. S.; Allton, J. H.; Rodriquez, M.; Tripa, C. E.; Veryovkin, I. V.

    2013-01-01

    The Genesis mission was the first mission returning solar material to Earth since the Apollo program [1,2]. Unfortunately the return of the space craft on September 8, 2004 resulted in a crash landing, which shattered the samples into small fragments and exposed them to desert soil and other debris. Thus only small fragments of the original collectors are available, each having different degrees of surface contamination. Thorough surface cleaning is required to allow for subsequent analysis of solar wind material embedded within. An initial cleaning procedure was developed in coordination with Johnson Space Center which focused on removing larger sized particulates and a thin film organic contamination acquired during collection in space [3]. However, many of the samples have additional residues and more rigorous and/or innovative cleaning steps might be necessary. These cleaning steps must affect only the surface to avoid leaching and re-distribution of solar wind material from the bulk of the collectors. To aid in development and identification of the most appropriate cleaning procedures each sample has to be thoroughly inspected before and after each cleaning step. Laboratory based total reflection X-ray fluorescence (TXRF) spectrometry lends itself to this task as it is a non-destructive and surface sensitive analytical method permitting analysis of elements from aluminum onward present at and near the surface of a flat substrate [4]. The suitability of TXRF has been demonstrated for several Genesis solar wind samples before and after various cleaning methods including acid treatment, gas cluster ion beam, and CO2 snow jet [5 - 7]. The latter one is non-invasive and did show some promise on one sample [5]. To investigate the feasibility of CO2 snow jet cleaning further, several flown Genesis samples were selected to be characterized before and after CO2 snow application with sample 61052 being discussed below.

  19. Algorithm Improvement Program Nuclide Identification Algorithm Scoring Criteria And Scoring Application - DNDO.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Enghauser, Michael

    2015-02-01

    The goal of the Domestic Nuclear Detection Office (DNDO) Algorithm Improvement Program (AIP) is to facilitate gamma-radiation detector nuclide identification algorithm development, improvement, and validation. Accordingly, scoring criteria have been developed to objectively assess the performance of nuclide identification algorithms. In addition, a Microsoft Excel spreadsheet application for automated nuclide identification scoring has been developed. This report provides an overview of the equations, nuclide weighting factors, nuclide equivalencies, and configuration weighting factors used by the application for scoring nuclide identification algorithm performance. Furthermore, this report presents a general overview of the nuclide identification algorithm scoring application including illustrative examples.

  20. Identification of sewer pipes to be cleaned for reduction of CSO pollutant load.

    PubMed

    Nagaiwa, Akihiro; Settsu, Katsushi; Nakajima, Fumiyuki; Furumai, Hiroaki

    2007-01-01

    To reduce the CSO (Combined Sewer Overflow) pollutant discharge, one of the effective options is cleaning of sewer pipes before rainfall events. To maximize the efficiency, identification of pipes to be cleaned is necessary. In this study, we discussed the location of pipe deposit in dry weather in a combined sewer system using a distributed model and investigated the effect of pipe cleaning to reduce the pollutant load from the CSO. First we simulated the dry weather flow in a combined sewer system. The pipe deposit distribution in the network was estimated after 3 days of dry weather period. Several specific pipes with structural defect and upper end pipes tend to have an accumulation of deposit. Wet weather simulations were conducted with and without pipe cleaning in rainfall events with different patterns. The SS loads in CSO with and without the pipe cleaning were compared. The difference in the estimated loads was interpreted as the contribution of wash-off in the cleaned pipe. The effect of pipe cleaning on reduction of the CSO pollutant load was quantitatively evaluated (e.g. the cleaning of one specific pipe could reduce 22% of total CSO load). The CSO simulations containing pipe cleaning options revealed that identification of pipes with accumulated deposit using the distributed model is very useful and informative to evaluate the applicability of pipe cleaning option for CSO pollutant reduction.

  1. A New Dusts Sensor for Cultural Heritage Applications Based on Image Processing

    PubMed Central

    Proietti, Andrea; Leccese, Fabio; Caciotta, Maurizio; Morresi, Fabio; Santamaria, Ulderico; Malomo, Carmela

    2014-01-01

    In this paper, we propose a new sensor for the detection and analysis of dusts (powders and fibers) in indoor environments, especially designed for applications in the field of Cultural Heritage or in other contexts where the presence of dust requires special care (surgery, clean rooms, etc.). The presented system relies on image processing techniques (enhancement, noise reduction, segmentation, metrics analysis) and yields both qualitative and quantitative information on the accumulation of dust, identifying the geometric and topological features of the elements of the deposit. Curators can use this information to design suitable prevention and maintenance actions for objects and environments. The sensor consists of simple and relatively cheap tools, based on a high-resolution image acquisition system, preprocessing software to improve the captured image, and an analysis algorithm for feature extraction and classification of the elements of the dust deposit. We carried out tests in order to validate the system operation. These tests were performed within the Sistine Chapel in the Vatican Museums, showing the good performance of the proposed sensor in terms of execution time and classification accuracy. PMID:24901977
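    A minimal sketch of this kind of analysis, under assumed parameters (a fixed grayscale threshold and 4-connectivity), thresholds a frame and reports the number of deposit elements and the fraction of the surface they cover:

```python
# A minimal sketch (with assumed parameters) of the pipeline described above:
# threshold a grayscale image to separate dust from the background, then
# measure coverage and count deposit elements by connected-component
# labeling (4-connectivity, plain BFS).
from collections import deque

def analyze_dust(image, threshold=128):
    """image: 2-D list of grayscale values; dust assumed darker than background."""
    h, w = len(image), len(image[0])
    mask = [[image[y][x] < threshold for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    components, dust_pixels = 0, 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                components += 1
                q = deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    dust_pixels += 1
                    for ny, nx in ((cy-1,cx),(cy+1,cx),(cy,cx-1),(cy,cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
    coverage = dust_pixels / (h * w)
    return components, coverage

# Tiny synthetic frame: two dark specks on a bright background.
frame = [[255]*6 for _ in range(4)]
frame[1][1] = frame[1][2] = 30        # a two-pixel "fiber"
frame[3][5] = 10                      # a single-pixel "particle"
print(analyze_dust(frame))            # components=2, coverage=0.125
```

    Component shape metrics (elongation, area) computed on the labeled elements would then distinguish fibers from compact particles, in the spirit of the geometric and topological features the sensor extracts.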

  2. Clean birth kits to improve birth practices: development and testing of a country level decision support tool.

    PubMed

    Hundley, Vanora A; Avan, Bilal I; Ahmed, Haris; Graham, Wendy J

    2012-12-19

    Clean birth practices can prevent sepsis, one of the leading causes of both maternal and newborn mortality. Evidence suggests that clean birth kits (CBKs), as part of a package that includes education, are associated with a reduction in newborn mortality, omphalitis, and puerperal sepsis. However, questions remain about how best to approach the introduction of CBKs in-country. We set out to develop a practical decision support tool for programme managers of public health systems who are considering the potential role of CBKs in their strategy for care at birth. Development and testing of the decision support tool was a three-stage process involving an international expert group and country-level testing. In Stage 1, the tool was developed by the Birth Kit Working Group through a review of the evidence, a consensus meeting, drafting of the proposed tool, and expert review. In Stage 2, the tool was tested with users through nine interviews and a focus group with federal- and provincial-level decision makers in Pakistan. In Stage 3, the findings from the country-level testing were reviewed by the expert group. The decision support tool comprised three separate algorithms to guide the policy maker or programme manager through the specific steps required in making the country-level decision about whether to use CBKs. The algorithms were supported by a series of questions (administrable by interview, focus group, or questionnaire) to help the decision maker identify the information needed. The country-level testing revealed that the decision support tool was easy to follow and helpful in making decisions about the potential role of CBKs. Minor modifications were made and the final algorithms are presented. Testing of the tool with users in Pakistan suggests that the tool facilitates discussion and aids decision making. However, testing in other countries is needed to determine whether these results can be replicated and to identify how the tool can be adapted to meet country-specific needs.

  3. Automated Array Assembly Task In-depth Study of Silicon Wafer Surface Texturizing

    NASA Technical Reports Server (NTRS)

    Jones, G. T.; Chitre, S.; Rhee, S. S.; Allison, K. L.

    1979-01-01

    A low cost wafer surface texturizing process was studied. An investigation of low cost cleaning operations to clean residual wax and organics from the surface of silicon wafers was made. The feasibility of replacing dry nitrogen with clean dry air for drying silicon wafers was examined. The two stage texturizing process was studied for the purpose of characterizing relevant parameters in large volume applications. The effect of gettering solar cells on photovoltaic energy conversion efficiency is described.

  4. 40 CFR 25.1 - Introduction.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... RESOURCE CONSERVATION AND RECOVERY ACT, THE SAFE DRINKING WATER ACT, AND THE CLEAN WATER ACT § 25.1... in activities under the Clean Water Act (Pub. L. 95-217), the Resource Conservation and Recovery Act (Pub. L. 94-580), and the Safe Drinking Water Act (Pub. L. 93-523). The applicability of the...

  5. Report: New Hampshire Clean Water State Revolving Fund Program Financial Statements with Independent Auditor’s Report, June 30, 2002

    EPA Pesticide Factsheets

    Report #2003-1-00086, March 26, 2003. The audit contains reports on the financial statements, internal controls, and compliance requirements applicable to the Clean Water State Revolving Fund program in New Hampshire for the year ended June 30, 2002.

  6. 40 CFR 63.803 - Work practice standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... containers for storing finishing, gluing, cleaning, and washoff materials. (h) Application equipment... from a container that has a volume of no more than 2.0 gallons. (3) When spray is automated, that is... shall pump or drain all organic HAP solvent used for line cleaning into a normally closed container. (j...

  7. Report: Eleven Years After Agreement, EPA Has Not Developed Reliable Emission Estimation Methods to Determine Whether Animal Feeding Operations Comply With Clean Air Act and Other Statutes

    EPA Pesticide Factsheets

    Report #17-P-0396, September 19, 2017. Until the EPA develops sound methods to estimate emissions, the agency cannot reliably determine whether animal feeding operations comply with applicable Clean Air Act requirements.

  8. 77 FR 3836 - Supplemental Environmental Impact Statement, Mingo County, WV

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... Mountain Surface Mine Clean Water Act Section 404 Permit Application. The FHWA and USACE are joint-lead... 2000 FEIS and approved in the 2000 ROD. The USACE is evaluating a Clean Water Act (CWA) Section 404... to surface water and groundwater resources, including aquatic habitat, water quantity and quality...

  9. Analysis of estimation algorithms for CDTI and CAS applications

    NASA Technical Reports Server (NTRS)

    Goka, T.

    1985-01-01

    Estimation algorithms for Cockpit Display of Traffic Information (CDTI) and Collision Avoidance System (CAS) applications were analyzed and/or developed. The algorithms are based on actual or projected operational and performance characteristics of an Enhanced TCAS II traffic sensor developed by Bendix and the Federal Aviation Administration. Three algorithm areas are examined and discussed: horizontal (x and y), range, and altitude estimation. Raw estimation errors are quantified using Monte Carlo simulations developed for each application; the raw errors are then used to infer impacts on the CDTI and CAS applications. Applications of smoothing algorithms to CDTI problems are also discussed briefly. Technical conclusions are summarized based on the analysis of simulation results.
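    As a hedged illustration of the kind of position/rate estimation analyzed here (the actual Enhanced TCAS II algorithms are not reproduced), a generic alpha-beta tracker with assumed gains can smooth noisy range or altitude measurements:

```python
# A generic alpha-beta tracking filter, offered as an illustration of the
# kind of position/rate estimation analyzed for CDTI/CAS. The gains alpha and
# beta are assumed values, not those of any fielded TCAS algorithm.

def alpha_beta_track(measurements, dt, alpha=0.5, beta=0.1):
    """Smooth noisy position measurements; returns (position, rate) estimates."""
    x, v = measurements[0], 0.0        # initial state from first measurement
    estimates = []
    for z in measurements[1:]:
        x_pred = x + v * dt            # predict ahead one sample
        r = z - x_pred                 # innovation (measurement residual)
        x = x_pred + alpha * r         # correct position estimate
        v = v + (beta / dt) * r        # correct rate estimate
        estimates.append((x, v))
    return estimates

# Constant-velocity target at 2 units/s sampled at 1 Hz, noise-free:
zs = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
est = alpha_beta_track(zs, dt=1.0)
print(round(est[-1][0], 2), round(est[-1][1], 2))
```

    With these conservative gains the estimates converge gradually toward the true position and rate; Monte Carlo runs over noisy measurement sequences would quantify the raw estimation errors, as the study does.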

  10. Robust Superhydrophobic Graphene-Based Composite Coatings with Self-Cleaning and Corrosion Barrier Properties.

    PubMed

    Nine, Md J; Cole, Martin A; Johnson, Lucas; Tran, Diana N H; Losic, Dusan

    2015-12-30

    Superhydrophobic surfaces for self-cleaning applications often suffer from mechanical instability and do not function well after abrasion/scratching. To address this problem, we present a method to prepare graphene-based superhydrophobic composite coatings with robust mechanical strength, self-cleaning, and barrier properties. A suspension has been formulated that contains a mixture of reduced graphene oxide (rGO) and diatomaceous earth (DE) modified with polydimethylsiloxane (PDMS) that can be applied on any surface using common coating methods such as spraying, brush painting, and dip coating. Inclusion of TiO2 nanoparticles in the formulation further increases the water contact angle (WCA) from 159 ± 2° to 170 ± 2° due to the structural improvement with hierarchical surface roughness. Mechanical stability and durability of the coatings have been achieved by using a commercial adhesive to bond the superhydrophobic "paint" to various substrates. Excellent retention of superhydrophobicity was observed even after sandpaper abrasion and crosscut scratching. A potentiodynamic polarization study revealed excellent corrosion resistance (96.78%), and acid exposure provided further insight into the coating's barrier properties. The ease of application and remarkable properties of this graphene-based composite coating show considerable potential for broad application as a self-cleaning and protective layer.

  11. Operator dermal exposure and protection provided by personal protective equipment and working coveralls during mixing/loading, application and sprayer cleaning in vineyards.

    PubMed

    Thouvenin, Isabelle; Bouneb, Françoise; Mercier, Thierry

    2017-06-01

    The efficiency of a working coverall combined with personal protective equipment to protect operators against dermal exposure to plant protection products under field conditions was studied. Operators wore a non-certified water-repellent finish polyester/cotton coverall plus a certified gown during the mixing/loading and cleaning phases. Insecticide foliar application to a vineyard was selected as the exposure scenario. The overall dermal residue levels measured in this study were in the range of data recently collected in Europe. The water-repellent finish working coverall reduced body exposure by approximately 95%. Wearing a Category III Type 3 partial body gown during mixing/loading and cleaning of the application equipment provided a further protective effect of 98.7%. The combination of a water-repellent finish working coverall and partial body protection during specific tasks provided satisfactory levels of protection and can be considered suitable for the conditions of use studied.

  12. Document image cleanup and binarization

    NASA Astrophysics Data System (ADS)

    Wu, Victor; Manmatha, Raghaven

    1998-04-01

    Image binarization is a difficult task for documents with text over textured or shaded backgrounds, poor contrast, and/or considerable noise. Current optical character recognition (OCR) and document analysis technologies do not handle such documents well. We have developed a simple yet effective algorithm for document image clean-up and binarization. The algorithm consists of two basic steps. In the first step, the input image is smoothed using a low-pass filter. The smoothing operation enhances the text relative to any background texture, because background texture normally has higher frequency content than text; it also removes speckle noise. In the second step, the intensity histogram of the smoothed image is computed and a threshold is automatically selected as follows. For black text, the first peak of the histogram corresponds to text. Thresholding the image at the value of the valley between the first and second peaks of the histogram binarizes the image well. In order to reliably identify the valley, the histogram is smoothed by a low-pass filter before the threshold is computed. The algorithm has been applied to some 50 images from a wide variety of sources: digitized video frames, photos, newspapers, advertisements in magazines or sales flyers, personal checks, etc. These images contain 21,820 characters and 4,406 words; 91 percent of the characters and 86 percent of the words are successfully cleaned up and binarized. A commercial OCR was applied to the binarized text when it consisted of fonts the OCR could recognize. The recognition rate was 84 percent for the characters and 77 percent for the words.
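    The threshold-selection step can be sketched as follows; the synthetic bimodal histogram and the smoothing window size are assumptions for illustration:

```python
# A sketch of the paper's threshold-selection step: smooth the intensity
# histogram with a moving-average low-pass filter, locate the first two peaks,
# and threshold at the valley between them. The pixel data (two triangular
# modes for "text" and "background") and window size are assumed.

def valley_threshold(pixels, levels=256, window=5):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    # Low-pass filter the histogram so spurious local maxima do not masquerade
    # as peaks (the paper smooths before locating the valley for reliability).
    half = window // 2
    smooth = [sum(hist[max(0, i - half):i + half + 1]) /
              len(hist[max(0, i - half):i + half + 1]) for i in range(levels)]
    peaks = [i for i in range(1, levels - 1)
             if smooth[i] > smooth[i - 1] and smooth[i] >= smooth[i + 1]]
    p1, p2 = peaks[0], peaks[1]        # first peak ~ text, second ~ background
    return min(range(p1, p2 + 1), key=smooth.__getitem__)  # valley index

# Deterministic bimodal "image": a dark text mode and a bright background mode.
pixels = []
for i in range(20, 61):
    pixels += [i] * (21 - abs(i - 40))      # dark mode peaking at 40
for i in range(150, 211):
    pixels += [i] * (31 - abs(i - 180))     # bright mode peaking at 180
t = valley_threshold(pixels)
print(t)
```

    Pixels below the returned threshold are classified as text; the valley lands in the empty band between the two modes.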

  13. An image-guided tool to prevent hospital acquired infections

    NASA Astrophysics Data System (ADS)

    Nagy, Melinda; Szilágyi, László; Lehotsky, Ákos; Haidegger, Tamás; Benyó, Balázs

    2011-03-01

    Hospital Acquired Infections (HAI) represent the fourth leading cause of death in the United States and claim hundreds of thousands of lives annually in the rest of the world. This paper presents a novel low-cost mobile device, called Stery-Hand, that helps to avoid HAI by improving hand hygiene control through an objective evaluation of the quality of hand washing. The use of the system is intuitive: after hand washing with a soap mixed with UV-reflective powder, the disinfected skin surfaces appear brighter under UV illumination. Washed hands are inserted into the Stery-Hand box, where a digital image is taken under UV lighting. Automated image processing algorithms are employed in three steps to evaluate the quality of hand washing. First, the contour of the hand is extracted in order to distinguish the hand from the background. Next, a semi-supervised clustering algorithm classifies the pixels of the hand into three groups, corresponding to clean, partially clean, and dirty areas. The clustering algorithm is derived from the histogram-based quick fuzzy c-means approach, using a priori information extracted from reference images evaluated by experts. Finally, the identified areas are adjusted to suppress shading effects and quantified in order to give a verdict on hand disinfection quality. The proposed methodology was validated through tests using hundreds of images recorded in our laboratory. The proposed system was found robust and accurate, producing correct estimation for over 98% of the test cases. Stery-Hand may be employed in general practice, and it may also serve educational purposes.
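    The clustering step can be illustrated with a plain histogram-based fuzzy c-means (m = 2); the synthetic trimodal histogram and initial centers below are assumptions, not the authors' reference-trained values:

```python
# A hedged sketch of histogram-based fuzzy c-means (m = 2), the family of
# clustering the Stery-Hand pipeline builds on, here classifying gray levels
# into three classes (dirty / partially clean / clean). The histogram and
# initial centers are illustrative assumptions.

def histogram_fcm(hist, centers, iters=50, m=2.0):
    """hist[g] = pixel count at gray level g; returns converged class centers."""
    levels = list(range(len(hist)))
    n_c = len(centers)
    for _ in range(iters):
        # Membership of each gray level in each cluster (standard FCM update).
        u = []
        for g in levels:
            d = [abs(g - c) + 1e-9 for c in centers]
            u.append([1.0 / sum((d[j] / d[k]) ** (2 / (m - 1))
                                for k in range(n_c)) for j in range(n_c)])
        # Histogram-weighted center update: each level counts hist[g] times.
        centers = [
            sum(hist[g] * u[g][j] ** m * g for g in levels) /
            sum(hist[g] * u[g][j] ** m for g in levels)
            for j in range(n_c)
        ]
    return sorted(centers)

# Synthetic trimodal histogram: dark (dirty), mid (partial), bright (clean).
hist = [0] * 256
for mu, n in ((40, 500), (120, 300), (210, 700)):
    for g in range(mu - 10, mu + 11):
        hist[g] += n
centers = histogram_fcm(hist, centers=[10.0, 128.0, 250.0])
print([round(c) for c in centers])
```

    Working on the 256-bin histogram rather than on every pixel is what makes the "quick" variant fast; each pixel is then assigned to the class whose converged center is nearest.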

  14. Enhancements to the caliop aerosol subtyping and lidar ratio selection algorithms for level II version 4

    NASA Astrophysics Data System (ADS)

    Omar, A.; Tackett, J.; Kim, M.-H.; Vaughan, M.; Kar, J.; Trepte, C.; Winker, D.

    2018-04-01

    Several enhancements have been implemented for the version 4 aerosol subtyping and lidar ratio selection algorithms of Cloud Aerosol Lidar with Orthogonal Polarization (CALIOP). Version 4 eliminates the confusion between smoke and clean marine aerosols seen in version 3 by modifications to the elevated layer flag definitions used to identify smoke aerosols over the ocean. To differentiate between mixtures of dust and smoke, and dust and marine aerosols, a new aerosol type will be added in the version 4 data products. In the marine boundary layer, moderately depolarizing aerosols are no longer modeled as mixtures of dust and smoke (polluted dust) but rather as mixtures of dust and seasalt (dusty marine). Some lidar ratios have been updated in the version 4 algorithms. In particular, the dust lidar ratios have been adjusted to reflect the latest measurements and model studies.

  15. Successive approximation algorithm for beam-position-monitor-based LHC collimator alignment

    NASA Astrophysics Data System (ADS)

    Valentino, Gianluca; Nosych, Andriy A.; Bruce, Roderik; Gasior, Marek; Mirarchi, Daniele; Redaelli, Stefano; Salvachua, Belen; Wollmann, Daniel

    2014-02-01

    Collimators with embedded beam position monitor (BPM) button electrodes will be installed in the Large Hadron Collider (LHC) during the current long shutdown period. For the subsequent operation, BPMs will allow the collimator jaws to be kept centered around the beam orbit. In this manner, a better beam cleaning efficiency and machine protection can be provided at unprecedented higher beam energies and intensities. A collimator alignment algorithm is proposed to center the jaws automatically around the beam. The algorithm is based on successive approximation and takes into account a correction of the nonlinear BPM sensitivity to beam displacement and an asymmetry of the electronic channels processing the BPM electrode signals. A software implementation was tested with a prototype collimator in the Super Proton Synchrotron. This paper presents results of the tests along with some considerations for eventual operation in the LHC.
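    The successive-approximation idea can be sketched as follows; the tanh-shaped BPM response, the half-gain step, and all numbers are invented for illustration and do not model the actual LHC electronics:

```python
# An illustrative successive-approximation loop for centering collimator jaws
# around the beam using a BPM reading. The nonlinear BPM response, its
# linearization, and the gains are invented for this sketch.
import math

def bpm_reading(true_offset, half_gap):
    """Assumed nonlinear BPM response: compresses large offsets (tanh-like)."""
    return half_gap * math.tanh(true_offset / half_gap)

def linearize(reading, half_gap):
    """Invert the assumed response to recover the beam-jaw offset."""
    return half_gap * math.atanh(max(-0.999, min(0.999, reading / half_gap)))

def center_jaws(beam_pos, jaw_center, half_gap=5.0, iters=12):
    for _ in range(iters):
        reading = bpm_reading(beam_pos - jaw_center, half_gap)
        # Conservative half-gain step on the linearized offset; successive
        # iterations close the remaining gap geometrically.
        jaw_center += 0.5 * linearize(reading, half_gap)
    return jaw_center

final = center_jaws(beam_pos=1.7, jaw_center=0.0)
print(round(final, 3))
```

    Each iteration halves the residual offset, so the jaw center converges to the beam position; correcting the nonlinear sensitivity before stepping is what keeps the convergence well behaved for large initial offsets.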

  16. The Search for Nonflammable Solvent Alternatives for Cleaning Aerospace Oxygen Systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Mark; Lowrey, Nikki

    2012-01-01

    Oxygen systems are susceptible to fires caused by particle and nonvolatile residue (NVR) contaminants; cleaning and verification are therefore essential for system safety. Cleaning solvents used on oxygen system components must either be nonflammable in pure oxygen or their complete removal must be assured. CFC-113 was the solvent of choice before 1996 because it was effective, least toxic, compatible with most materials of construction, and nonreactive with oxygen. When CFC-113 was phased out in 1996, HCFC-225 was selected as an interim replacement for cleaning propulsion oxygen systems at NASA. The HCFC-225 production phase-out date is 01/01/2015. HCFC-225 (AK-225G) is used extensively at Marshall Space Flight Center and Stennis Space Center for cleaning and NVR verification on large propulsion oxygen systems and on propulsion test stands and ground support equipment. Many components are too large for ultrasonic agitation, which is necessary for effective aqueous cleaning and NVR sampling. Test stand equipment must be cleaned prior to installation of test hardware, and many items must be cleaned by wipe or flush in situ, where complete removal of a flammable solvent cannot be assured. The search for a replacement solvent for these applications is ongoing.

  17. Preparation of Mica and Silicon Substrates for DNA Origami Analysis and Experimentation

    PubMed Central

    Pillers, Michelle A.; Shute, Rebecca; Farchone, Adam; Linder, Keenan P.; Doerfler, Rose; Gavin, Corey; Goss, Valerie; Lieberman, Marya

    2015-01-01

    The designed nature and controlled, one-pot synthesis of DNA origami provides exciting opportunities in many fields, particularly nanoelectronics. Many of these applications require interaction with and adhesion of DNA nanostructures to a substrate. Due to its atomically flat and easily cleaned nature, mica has been the substrate of choice for DNA origami experiments. However, the practical applications of mica are relatively limited compared to those of semiconductor substrates. For this reason, a straightforward, stable, and repeatable process for DNA origami adhesion on derivatized silicon oxide is presented here. To promote the adhesion of DNA nanostructures to silicon oxide surface, a self-assembled monolayer of 3-aminopropyltriethoxysilane (APTES) is deposited from an aqueous solution that is compatible with many photoresists. The substrate must be cleaned of all organic and metal contaminants using Radio Corporation of America (RCA) cleaning processes and the native oxide layer must be etched to ensure a flat, functionalizable surface. Cleanrooms are equipped with facilities for silicon cleaning, however many components of DNA origami buffers and solutions are often not allowed in them due to contamination concerns. This manuscript describes the set-up and protocol for in-lab, small-scale silicon cleaning for researchers who do not have access to a cleanroom or would like to incorporate processes that could cause contamination of a cleanroom CMOS clean bench. Additionally, variables for regulating coverage are discussed and how to recognize and avoid common sample preparation problems is described. PMID:26274888

  18. Cleaning of titanium substrates after application in a bioreactor.

    PubMed

    Fingerle, Mathias; Köhler, Oliver; Rösch, Christina; Kratz, Fabian; Scheibe, Christian; Davoudi, Neda; Müller-Renno, Christine; Ziegler, Christiane; Huster, Manuel; Schlegel, Christin; Ulber, Roland; Bohley, Martin; Aurich, Jan C

    2015-03-10

    Plain and microstructured cp-titanium samples were studied as possible biofilm reactor substrates. The biofilms were grown by exposing the titanium samples to bacteria in a flow cell. The rod-shaped gram-negative Pseudomonas fluorescens and the spherical gram-negative Paracoccus seriniphilus were chosen as test bacteria. Afterward, the samples were cleaned in two subsequent steps: first with a standard solvent-based cleaning procedure using acetone, isopropanol, and ultrapure water, and second by oxygen plasma sputtering. It is demonstrated by means of x-ray photoelectron spectroscopy, fluorescence microscopy, and confocal laser scanning microscopy that oxygen plasma cleaning is a necessary and reliable tool to fully clean and restore titanium surfaces contaminated with a biofilm. The microstructured surfaces are beneficial to biofilm growth, while still being fully restorable after biofilm contamination. Scanning electron microscopy images additionally show that the plasma process does not affect the microstructures. The presented data show the importance of the cleaning procedure: solvents alone do not remove the biofilm and all its components reliably, whereas oxygen plasma cleaning regenerates the surfaces.

  19. Hardware cleanliness methodology and certification

    NASA Technical Reports Server (NTRS)

    Harvey, Gale A.; Lash, Thomas J.; Rawls, J. Richard

    1995-01-01

    The inadequacy of mass-loss cleanliness criteria, both for selecting materials for contamination-sensitive uses and for processing flight hardware for contamination-sensitive instruments, is discussed. Materials selection for flight hardware is usually based on mass loss (ASTM E-595), whereas flight hardware cleanliness (MIL 1246A) is a surface cleanliness assessment. It is possible for materials (e.g., Sil-Pad 2000) to pass ASTM E-595 and fail MIL 1246A class A by orders of magnitude. Conversely, it is possible for small amounts of nonconforming material (Huma-Seal conformal coating) to present no significant cleanliness problems for an optical flight instrument. Effective cleaning (precleaning, precision cleaning, and ultra cleaning) and cleanliness verification are essential for contamination-sensitive flight instruments. Polish cleaning of hardware (e.g., vacuum baking for vacuum applications) and storage of clean hardware (e.g., laser optics) are discussed. Silicone materials present special concerns for use in space because of the rapid conversion of the outgassed residues to glass by solar ultraviolet radiation and/or atomic oxygen. Non-ozone-depleting solvent cleaning and institutional support for cleaning and certification are also discussed.

  20. Cleaning without chlorinated solvents

    NASA Technical Reports Server (NTRS)

    Thompson, L. M.; Simandl, R. F.

    1995-01-01

    Because of health and environmental concerns, many regulations have been passed in recent years regarding the use of chlorinated solvents. The Oak Ridge Y-12 Plant has had an active program to find alternatives for these solvents used in cleaning applications for the past 7 years. During this time frame, the quantity of solvents purchased has been reduced by 92 percent. The program has been a twofold effort. Vapor degreasers used in batch cleaning operations have been replaced by ultrasonic cleaning with aqueous detergent, and other organic solvents have been identified for use in hand-wiping or specialty operations. In order to qualify these alternatives for use, experimentation was conducted on cleaning ability as well as effects on subsequent operations such as welding, painting, and bonding. Cleaning ability was determined using techniques such as x-ray photoelectron spectroscopy (XPS) and Fourier transform infrared spectroscopy (FTIR) which are capable of examining monolayer levels of contamination on a surface. Solvents have been identified for removal of rust preventative oils, lapping oils, machining coolants, lubricants, greases, and mold releases. Solvents have also been evaluated for cleaning urethane foam spray guns, swelling of urethanes, and swelling of epoxies.

  1. Data pre-processing in record linkage to find the same companies from different databases

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Lubis, M. S.; Arisandi, D.; Azzahry, B.

    2018-03-01

    As public agencies, the Badan Pelayanan Perizinan Terpadu (BPPT) and the Badan Lingkungan Hidup (BLH) of Medan city manage the process by which the public obtains business licenses. However, because data entry is performed separately, each agency may hold different company records even when those records refer to the same company. It is therefore necessary to identify and link records in different data sources that refer to the same company. This research focuses on data pre-processing steps such as data cleaning, text pre-processing, indexing, and record comparison. In addition, this research implements data matching using a support vector machine algorithm, whose output drives the record linkage, identifying and connecting company records according to their degree of similarity. The raw data are first standardized to the format and structure required by each pre-processing stage. After analyzing the data pre-processing, we found that neither database structure is designed to support data integration, and that data matching can be done with blocking criteria such as the company name and the name of the owner (or applicant). After pre-processing, the data classification identified 90 pairs of records with a high level of similarity.
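    The blocking-and-comparison flow can be sketched as follows; difflib's string similarity stands in for the study's SVM classifier, and the company names and cleaning rules are invented:

```python
# A minimal sketch of the pre-processing and matching flow described above:
# standardize names, block on a cleaned company-name key, then compare
# candidate pairs with a string-similarity score. difflib stands in for the
# SVM classifier used in the study; the company names are invented.
from difflib import SequenceMatcher
from collections import defaultdict

def clean(name):
    """Text pre-processing: lowercase, strip punctuation and legal suffixes."""
    name = "".join(ch for ch in name.lower() if ch.isalnum() or ch == " ")
    stop = {"pt", "cv", "ltd", "co"}   # assumed suffix list
    return " ".join(w for w in name.split() if w not in stop)

def block_key(name):
    return clean(name).split()[0]      # block on the first significant word

def link(db_a, db_b, threshold=0.85):
    blocks = defaultdict(list)
    for rec in db_b:
        blocks[block_key(rec)].append(rec)
    matches = []
    for rec in db_a:
        for cand in blocks[block_key(rec)]:   # only compare within a block
            sim = SequenceMatcher(None, clean(rec), clean(cand)).ratio()
            if sim >= threshold:
                matches.append((rec, cand, round(sim, 2)))
    return matches

bppt = ["PT. Maju Jaya", "CV Sinar Abadi"]
blh = ["Maju Jaya Ltd", "Sinar Terang"]
print(link(bppt, blh))
```

    Blocking keeps the number of comparisons linear in practice: only records sharing a key are compared, which is why the choice of blocking criteria (company name, owner name) matters so much.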

  2. YM500v2: a small RNA sequencing (smRNA-seq) database for human cancer miRNome research

    PubMed Central

    Cheng, Wei-Chung; Chung, I-Fang; Tsai, Cheng-Fong; Huang, Tse-Shun; Chen, Chen-Yang; Wang, Shao-Chuan; Chang, Ting-Yu; Sun, Hsing-Jen; Chao, Jeffrey Yung-Chuan; Cheng, Cheng-Chung; Wu, Cheng-Wen; Wang, Hsei-Wei

    2015-01-01

    We previously presented YM500, which is an integrated database for miRNA quantification, isomiR identification, arm switching discovery and novel miRNA prediction from 468 human smRNA-seq datasets. In this updated YM500v2 database (http://ngs.ym.edu.tw/ym500/), we focus on the cancer miRNome to make the database more disease-orientated. New miRNA-related algorithms developed after YM500 were included in YM500v2, and, more significantly, more than 8000 cancer-related smRNA-seq datasets (including those of primary tumors, paired normal tissues, PBMC, recurrent tumors, and metastatic tumors) were incorporated into YM500v2. Novel miRNAs (miRNAs not included in the miRBase R21) were not only predicted by three independent algorithms but also cleaned by a new in silico filtration strategy and validated by wet-lab data such as Cross-Linked ImmunoPrecipitation sequencing (CLIP-seq) to reduce the false-positive rate. A new function ‘Meta-analysis’ is additionally provided, allowing users to identify real-time differentially expressed miRNAs and arm-switching events according to customer-defined sample groups and dozens of clinical criteria tidied up by proficient clinicians. Cancer miRNAs identified hold the potential for both basic research and biotech applications. PMID:25398902

  3. CGBayesNets: Conditional Gaussian Bayesian Network Learning and Inference with Mixed Discrete and Continuous Data

    PubMed Central

    Weiss, Scott T.

    2014-01-01

    Bayesian Networks (BN) have been a popular predictive modeling formalism in bioinformatics, but their application in modern genomics has been slowed by an inability to cleanly handle domains with mixed discrete and continuous variables. Existing free BN software packages either discretize continuous variables, which can lead to information loss, or do not include inference routines, which makes prediction with the BN impossible. We present CGBayesNets, a BN package focused around prediction of a clinical phenotype from mixed discrete and continuous variables, which fills these gaps. CGBayesNets implements Bayesian likelihood and inference algorithms for the conditional Gaussian Bayesian network (CGBNs) formalism, one appropriate for predicting an outcome of interest from, e.g., multimodal genomic data. We provide four different network learning algorithms, each making a different tradeoff between computational cost and network likelihood. CGBayesNets provides a full suite of functions for model exploration and verification, including cross validation, bootstrapping, and AUC manipulation. We highlight several results obtained previously with CGBayesNets, including predictive models of wood properties from tree genomics, leukemia subtype classification from mixed genomic data, and robust prediction of intensive care unit mortality outcomes from metabolomic profiles. We also provide detailed example analysis on public metabolomic and gene expression datasets. CGBayesNets is implemented in MATLAB and available as MATLAB source code, under an Open Source license and anonymous download at http://www.cgbayesnets.com. PMID:24922310

  4. CGBayesNets: conditional Gaussian Bayesian network learning and inference with mixed discrete and continuous data.

    PubMed

    McGeachie, Michael J; Chang, Hsun-Hsien; Weiss, Scott T

    2014-06-01

    Bayesian Networks (BN) have been a popular predictive modeling formalism in bioinformatics, but their application in modern genomics has been slowed by an inability to cleanly handle domains with mixed discrete and continuous variables. Existing free BN software packages either discretize continuous variables, which can lead to information loss, or do not include inference routines, which makes prediction with the BN impossible. We present CGBayesNets, a BN package focused around prediction of a clinical phenotype from mixed discrete and continuous variables, which fills these gaps. CGBayesNets implements Bayesian likelihood and inference algorithms for the conditional Gaussian Bayesian network (CGBNs) formalism, one appropriate for predicting an outcome of interest from, e.g., multimodal genomic data. We provide four different network learning algorithms, each making a different tradeoff between computational cost and network likelihood. CGBayesNets provides a full suite of functions for model exploration and verification, including cross validation, bootstrapping, and AUC manipulation. We highlight several results obtained previously with CGBayesNets, including predictive models of wood properties from tree genomics, leukemia subtype classification from mixed genomic data, and robust prediction of intensive care unit mortality outcomes from metabolomic profiles. We also provide detailed example analysis on public metabolomic and gene expression datasets. CGBayesNets is implemented in MATLAB and available as MATLAB source code, under an Open Source license and anonymous download at http://www.cgbayesnets.com.
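    The conditional Gaussian idea at the heart of CGBNs can be illustrated with a single node: a continuous variable gets a separate Gaussian for each state of its discrete parent, and prediction picks the state with the higher likelihood. This is a one-node sketch with invented data, not the CGBayesNets API:

```python
# A small sketch of the conditional Gaussian formalism underlying CGBNs:
# a continuous child variable is modeled with a separate Gaussian per state
# of its discrete parent; prediction maximizes the log-likelihood.
# One-node illustration with invented data, not the CGBayesNets package.
import math

def fit_cg_node(data):
    """data: list of (discrete_state, continuous_value) -> per-state (mu, sigma)."""
    groups = {}
    for state, x in data:
        groups.setdefault(state, []).append(x)
    params = {}
    for state, xs in groups.items():
        mu = sum(xs) / len(xs)
        var = sum((x - mu) ** 2 for x in xs) / len(xs)
        params[state] = (mu, math.sqrt(var))
    return params

def log_gauss(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def predict_state(params, x):
    return max(params, key=lambda s: log_gauss(x, *params[s]))

# Invented example: a metabolite level conditional on a binary phenotype.
data = [("case", 5.1), ("case", 4.9), ("case", 5.0),
        ("control", 2.0), ("control", 2.2), ("control", 1.8)]
params = fit_cg_node(data)
print(predict_state(params, 4.7), predict_state(params, 2.1))
```

    A full CGBN chains many such nodes (with discrete parents for discrete nodes and linear-Gaussian dependencies among continuous ones), which is what avoids the information loss of discretizing continuous variables.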

  5. Phenology Data Products to Support Assessment and Forecasting of Phenology on Multiple Spatiotemporal Scales

    NASA Astrophysics Data System (ADS)

    Gerst, K.; Enquist, C.; Rosemartin, A.; Denny, E. G.; Marsh, L.; Moore, D. J.; Weltzin, J. F.

    2014-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and environmental change. The National Phenology Database maintained by USA-NPN now has over 3.7 million records for plants and animals for the period 1954-2014, with the majority of these observations collected since 2008 as part of a broad, national contributory science strategy. These data have been used in a number of science, conservation and resource management applications, including national assessments of historical and potential future trends in phenology, regional assessments of spatio-temporal variation in organismal activity, and local monitoring for invasive species detection. Customizable data downloads are freely available, and data are accompanied by FGDC-compliant metadata, data-use and data-attribution policies, vetted and documented methodologies and protocols, and version control. While users are free to develop custom algorithms for data cleaning, winnowing and summarization prior to analysis, the National Coordinating Office of USA-NPN is developing a suite of standard data products to facilitate use and application by a diverse set of data users. This presentation provides a progress report on data product development, including: (1) Quality controlled raw phenophase status data; (2) Derived phenometrics (e.g. onset, duration) at multiple scales; (3) Data visualization tools; (4) Tools to support assessment of species interactions and overlap; (5) Species responsiveness to environmental drivers; (6) Spatially gridded phenoclimatological products; and (7) Algorithms for modeling and forecasting future phenological responses. The prioritization of these data products is a direct response to stakeholder needs related to informing management and policy decisions. 
We anticipate that these products will contribute to broad understanding of plant and animal phenology across scientific disciplines.
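As a sketch of the kind of derived phenometric described in item (2) above, the following hypothetical helper (illustrative only, not the USA-NPN algorithm) derives an onset date and duration from raw phenophase status records:

```python
from datetime import date

def phenometrics(observations):
    """Derive simple phenometrics from (date, phenophase_status) records:
    onset = first date with a positive status report; duration = span in
    days from onset to the last positive report (inclusive)."""
    yes_dates = sorted(d for d, status in observations if status)
    if not yes_dates:
        return None  # phenophase never observed in this record set
    onset = yes_dates[0]
    return {"onset": onset,
            "duration_days": (yes_dates[-1] - onset).days + 1}
```

Real data products must additionally handle conflicting observer reports, quality flags, and censoring at the start and end of the observation season.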

  6. Automated wavelet denoising of photoacoustic signals for circulating melanoma cell detection and burn image reconstruction.

    PubMed

    Holan, Scott H; Viator, John A

    2008-06-21

Photoacoustic image reconstruction may involve hundreds of point measurements, each of which contributes unique information about the subsurface absorbing structures under study. For backprojection imaging, two or more point measurements of photoacoustic waves induced by irradiating a biological sample with laser light are used to produce an image of the acoustic source. Each of these measurements must undergo some signal processing, such as denoising or system deconvolution. In order to process the numerous signals, we have developed an automated wavelet algorithm for denoising signals. We appeal to the discrete wavelet transform for denoising photoacoustic signals generated in a dilute melanoma cell suspension and in thermally coagulated blood. We used 5, 9, 45 and 270 melanoma cells in the laser beam path as test concentrations. For the burn phantom, we used coagulated blood in a 1.6 mm silicone tube submerged in Intralipid. Although these two targets were chosen as typical applications for photoacoustic detection and imaging, they are of independent interest. The denoising employs level-independent universal thresholding. In order to accommodate non-radix-2 signals, we considered a maximal overlap discrete wavelet transform (MODWT). For the lower melanoma cell concentrations, as the signal-to-noise ratio approached 1, denoising allowed better peak finding. For coagulated blood, the signals were denoised to yield a clean photoacoustic signal, resulting in a 22% improvement in the reconstructed image. The entire signal processing technique was automated so that minimal user intervention was needed to reconstruct the images. Such an algorithm may be used for image reconstruction and signal extraction for applications such as burn depth imaging, depth profiling of vascular lesions in skin and the detection of single cancer cells in blood samples.
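The pipeline described here — wavelet transform, level-independent universal thresholding of the detail coefficients, inverse transform — can be sketched with a single-level Haar transform in plain Python. This is a hedged illustration of the general technique, not the authors' MODWT implementation (which handles non-radix-2 lengths and multiple decomposition levels):

```python
import math

def haar_dwt(signal):
    """One-level Haar transform (assumes even length): returns
    (approximation, detail) coefficient lists."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of haar_dwt."""
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out

def universal_threshold(detail, n):
    """Donoho universal threshold, sigma estimated from the median
    absolute deviation of the detail coefficients."""
    med = sorted(abs(d) for d in detail)[len(detail) // 2]
    sigma = med / 0.6745
    return sigma * math.sqrt(2 * math.log(n))

def denoise(signal):
    approx, detail = haar_dwt(signal)
    t = universal_threshold(detail, len(signal))
    # soft thresholding: shrink detail coefficients toward zero by t
    detail = [math.copysign(max(abs(d) - t, 0.0), d) for d in detail]
    return haar_idwt(approx, detail)
```

Small detail coefficients (mostly noise) are zeroed while the coarse structure carried by the approximation coefficients survives, which is what enables the better peak finding reported at low signal-to-noise ratios.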

  7. Improving Pattern Recognition and Neural Network Algorithms with Applications to Solar Panel Energy Optimization

    NASA Astrophysics Data System (ADS)

    Zamora Ramos, Ernesto

Artificial intelligence is a central component of automation, and with today's technological advances it has taken great strides towards positioning itself as the technology of the future for controlling, enhancing and perfecting automation. Computer vision, which encompasses pattern recognition, classification and machine learning, is at the core of decision making and is a vast and fruitful branch of artificial intelligence. In this work, we present novel algorithms and techniques built upon existing technologies to improve pattern recognition and neural network training, initially motivated by a multidisciplinary effort to build a robot that helps maintain and optimize solar panel energy production. Our contributions include an improved non-linear pre-processing technique for enhancing poorly illuminated images, based on modifications to standard histogram equalization. While the original motivation was to improve nocturnal navigation, the results have applications in surveillance, search and rescue, medical image enhancement, and many other areas. We created a vision system for precise camera distance positioning, motivated by the need to position the robot correctly to capture solar panel images for classification. The classification algorithm marks solar panels as clean or dirty for later processing. Our algorithm extends past image classification and, based on historical and experimental data, identifies the optimal moment at which to perform maintenance on marked solar panels so as to minimize energy and profit loss. To improve upon the classification algorithm, we investigated feedforward neural networks because of their recent advancements, proven universal approximation and classification capabilities, and excellent recognition rates.
We explore state-of-the-art neural network training techniques, offering pointers and insights, culminating in the implementation of a complete library with support for modern deep learning architectures, multilayer perceptrons and convolutional neural networks. Our research with neural networks encountered considerable difficulty with hyperparameter estimation for good training convergence rate and accuracy. Most hyperparameters, including architecture, learning rate, regularization, trainable parameter (weight) initialization, and so on, are chosen via trial and error with some educated guesses. However, we developed the first quantitative method to compare weight initialization strategies, a critical hyperparameter choice during training, estimating which of a group of candidate strategies would make the network converge to the highest classification accuracy fastest, with high probability. Our method provides a quick, objective measure for selecting the best initialization strategy beforehand, without having to complete multiple training sessions for each candidate and compare final results.
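The abstract describes the pre-processing step as a modification of standard histogram equalization; the modification itself is not specified, but the standard baseline it builds on is easy to sketch (plain Python, grayscale intensities as a flat list):

```python
def equalize_histogram(pixels, levels=256):
    """Standard histogram equalization: remap intensities through the
    normalized cumulative distribution so they spread over [0, levels-1].
    Assumes every pixel value lies in range(levels)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)  # smallest nonzero CDF value
    n = len(pixels)
    if n == cdf_min:                         # degenerate: single intensity
        return list(pixels)
    lut = [round((cdf[v] - cdf_min) / (n - cdf_min) * (levels - 1))
           for v in range(levels)]
    return [lut[p] for p in pixels]
```

For low-light enhancement, a common refinement (one plausible direction for the modification mentioned here, though not confirmed by the abstract) is to equalize locally or clip the histogram before computing the CDF, as in CLAHE.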

  8. The Acquisition and Transfer of Knowledge of Electrokinetic-Hydrodynamics (EKHD) Fundamentals: an Introductory Graduate-Level Course

    ERIC Educational Resources Information Center

    Pascal, Jennifer; Tíjaro-Rojas, Rocío; Oyanader, Mario A.; Arce, Pedro E.

    2017-01-01

Relevant engineering applications, such as bioseparation of proteins and DNA, soil cleaning, motion of colloidal particles in different media, electrical field-based cancer treatments, and the cleaning of surfaces and coating flows, belong to the family of "Applied Field Sensitive Process Technologies" requiring an external field to…

  9. Thumbnail Sketches: EDTA-Type Chelating Agents in Everyday Consumer Products: Some Food, Cleaning, and Photographic Applications.

    ERIC Educational Resources Information Center

    Hart, J. Roger

    1985-01-01

    Discusses the role of chelating agents in (1) mayonnaise and salad dressings; (2) canned legumes; (3) plant foods; (4) liquid dishwashing detergents; (5) toilet soaps; (6) floor wax removers; (7) hard surface cleaners; (8) carpet cleaning; (9) bathtub and tile cleaners; and (10) photography. (JN)

  10. 76 FR 62061 - Clean Water Act Section 303(d): Availability of List Decisions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-06

    ... ENVIRONMENTAL PROTECTION AGENCY [FRL-9475-4] Clean Water Act Section 303(d): Availability of List... three waterbodies. These three waterbodies were added by EPA because the applicable numeric water... be obtained at EPA Region 6's Web site at http://www.epa.gov/region6/water/npdes/tmdl/index.htm...

  11. Air pollution control systems in WtE units: An overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vehlow, J., E-mail: juergen.vehlow@partner.kit.edu

Highlights: • The paper describes in brief terms the development of gas cleaning in waste incineration. • The main technologies for pollutant removal are described, including their basic mechanisms. • Their respective efficiencies and their application are discussed. • A cautious outlook regarding future developments is made. - Abstract: All WtE (waste-to-energy) plants, based on combustion or other thermal processes, need efficient gas cleaning for compliance with legislative air emission standards. The development of gas cleaning technologies started along with environmental protection regulations in the late 1960s. Modern APC (air pollution control) systems comprise multiple stages for the removal of fly ashes, inorganic and organic gases, heavy metals, and dioxins from the flue gas. The main technologies and devices used for abatement of the various pollutants are described, and their basic principles, their peculiarities, and their application are discussed. A few systems for cleaning of synthesis gas from waste gasification plants are included. Examples of APC designs in full-scale plants are shown and cautious prospects for the future development of APC systems are made.

  12. TiO2-SiO2 Coatings with a Low Content of AuNPs for Producing Self-Cleaning Building Materials

    PubMed Central

    Gil, M. L. Almoraima; Mosquera, María J.

    2018-01-01

    The high pollution levels in our cities are producing a significant increase of dust on buildings. An application of photoactive coatings on building materials can produce buildings with self-cleaning surfaces. In this study, we have developed a simple sol-gel route for producing Au-TiO2/SiO2 photocatalysts with application on buildings. The gold nanoparticles (AuNPs) improved the TiO2 photoactivity under solar radiation because they promoted absorption in the visible range. We varied the content of AuNPs in the sols under study, in order to investigate their effect on self-cleaning properties. The sols obtained were sprayed on a common building stone, producing coatings which adhere firmly to the stone and preserve their aesthetic qualities. We studied the decolourization efficiency of the photocatalysts under study against methylene blue and against soot (a real staining agent for buildings). Finally, we established that the coating with an intermediate Au content presented the best self-cleaning performance, due to the role played by its structure and texture on its photoactivity. PMID:29558437

  13. Nanostructured Gd3+-TiO2 surfaces for self-cleaning application

    NASA Astrophysics Data System (ADS)

    Saif, M.; El-Molla, S. A.; Aboul-Fotouh, S. M. K.; Ibrahim, M. M.; Ismail, L. F. M.; Dahn, Douglas C.

    2014-06-01

Preparation of self-cleaning surfaces based on lanthanide-modified titanium dioxide nanoparticles has rarely been reported. In the present work, gadolinium-doped titanium dioxide thin films (x mol Gd3+-TiO2, where x = 0.000, 0.005, 0.008, 0.010, 0.020 and 0.030 mol) were synthesized by the sol-gel method and deposited using the doctor-blade method. These films were characterized by studying their structural, optical and electrical properties. Doping with gadolinium decreases the band gap energy and increases the conductivity of the thin films. The photo self-cleaning activity, in terms of quantitative determination of the active oxidative species (•OH) produced on the thin film surfaces, was evaluated using the fluorescent probe method. The results show that the most active thin film is the 0.020 Gd3+-TiO2 film. The structural, morphological, optical, electrical and photoactivity properties of the Gd3+-TiO2 thin films make them promising surfaces for self-cleaning applications. Mineralization of a commercial textile dye (Remazol Red RB-133, RR) and the durability of the 0.020 Gd3+-TiO2 film surface were also studied.

  14. Cleaning with Bulk Nanobubbles.

    PubMed

    Zhu, Jie; An, Hongjie; Alheshibri, Muidh; Liu, Lvdan; Terpstra, Paul M J; Liu, Guangming; Craig, Vincent S J

    2016-11-01

    The electrolysis of aqueous solutions produces solutions that are supersaturated in oxygen and hydrogen gas. This results in the formation of gas bubbles, including nanobubbles ∼100 nm in size that are stable for ∼24 h. These aqueous solutions containing bubbles have been evaluated for cleaning efficacy in the removal of model contaminants bovine serum albumin and lysozyme from surfaces and in the prevention of the fouling of surfaces by these same proteins. Hydrophilic and hydrophobic surfaces were investigated. It is shown that nanobubbles can prevent the fouling of surfaces and that they can also clean already fouled surfaces. It is also argued that in practical applications where cleaning is carried out rapidly using a high degree of mechanical agitation the role of cleaning agents is not primarily in assisting the removal of soil but in suspending the soil that is removed by mechanical action and preventing it from redepositing onto surfaces. This may also be the primary mode of action of nanobubbles during cleaning.

  15. Fluid drag reduction and efficient self-cleaning with rice leaf and butterfly wing bioinspired surfaces

    NASA Astrophysics Data System (ADS)

    Bixler, Gregory D.; Bhushan, Bharat

    2013-08-01

    Researchers are continually inspired by living nature to solve complex challenges. For example, unique surface characteristics of rice leaves and butterfly wings combine the shark skin (anisotropic flow leading to low drag) and lotus leaf (superhydrophobic and self-cleaning) effects, producing the so-called rice and butterfly wing effect. In this paper, we present an overview of rice leaf and butterfly wing fluid drag and self-cleaning studies. In addition, we examine two other promising aquatic surfaces in nature known for such properties, including fish scales and shark skin. Morphology, drag, self-cleaning, contact angle, and contact angle hysteresis data are presented to understand the role of wettability, viscosity, and velocity. Liquid repellent coatings are utilized to recreate or combine various effects. Discussion is provided along with conceptual models describing the role of surface structures related to low drag, self-cleaning, and antifouling properties. Modeling provides design guidance when developing novel low drag and self-cleaning surfaces for applications in the medical, marine, and industrial fields.

  16. Photoplethysmograph signal reconstruction based on a novel hybrid motion artifact detection-reduction approach. Part I: Motion and noise artifact detection.

    PubMed

    Chong, Jo Woon; Dao, Duy K; Salehizadeh, S M A; McManus, David D; Darling, Chad E; Chon, Ki H; Mendelson, Yitzhak

    2014-11-01

Motion and noise artifacts (MNA) are a serious obstacle to utilizing photoplethysmogram (PPG) signals for real-time monitoring of vital signs. We present an MNA detection method which provides a clean-vs-corrupted decision on each successive PPG segment. For motion artifact detection, we compute four time-domain parameters: (1) standard deviation of peak-to-peak intervals, (2) standard deviation of peak-to-peak amplitudes, (3) standard deviation of systolic and diastolic interval ratios, and (4) mean standard deviation of pulse shape. We adopted a support vector machine (SVM) which takes these parameters from clean and corrupted PPG signals and builds a decision boundary to classify them. We apply several distinct features of the PPG data to enhance classification performance. The algorithm we developed was verified on PPG data segments recorded in simulation, laboratory-controlled, and walking/stair-climbing experiments, and we compared several well-established MNA detection methods to our proposed algorithm. All compared detection algorithms were evaluated in terms of motion artifact detection accuracy, heart rate (HR) error, and oxygen saturation (SpO2) error. For laboratory-controlled finger and forehead PPG data and daily-activity movement data, our proposed algorithm gives accuracies of 94.4, 93.4, and 93.7%, respectively. HR and SpO2 errors were significantly reduced (to 2.3 bpm and 2.7%) when the artifacts identified by SVM-MNA were removed from the original signal, compared with the errors without removal (17.3 bpm and 5.4%). The accuracy of our proposed method was significantly higher, and its errors significantly lower, than those of all other detection methods. Another advantage of our method is its ability to provide highly accurate onset and offset detection times of MNAs.
This capability is important for an automated approach to reconstructing only those data points that need to be reconstructed, which is the subject of the companion paper to this article. Finally, our MNA detection algorithm is real-time capable: processing a 7-s PPG data segment took only 7 ms in Matlab.
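Two of the four segment-level features are straightforward to compute once pulse peaks have been located; a minimal sketch follows (the systolic/diastolic interval ratio and pulse-shape features need beat morphology and are omitted, and the feature names here are illustrative, not from the paper):

```python
import statistics

def mna_features(peak_times, peak_amps):
    """Per-segment MNA features from detected PPG pulse peaks:
    std of peak-to-peak intervals and std of peak-to-peak amplitudes.
    Corrupted segments typically show larger values of both."""
    intervals = [t2 - t1 for t1, t2 in zip(peak_times, peak_times[1:])]
    return {"std_pp_interval": statistics.stdev(intervals),
            "std_pp_amplitude": statistics.stdev(peak_amps)}
```

In the paper these features feed an SVM trained on labeled clean and corrupted segments; in a simpler system they could equally drive a fixed threshold rule.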

  17. Single-Sex Schools, Student Achievement, and Course Selection: Evidence from Rule-Based Student Assignments in Trinidad and Tobago. NBER Working Paper No. 16817

    ERIC Educational Resources Information Center

    Jackson, C. Kirabo

    2011-01-01

    Existing studies on single-sex schooling suffer from biases due to student selection to schools and single-sex schools being better in unmeasured ways. In Trinidad and Tobago students are assigned to secondary schools based on an algorithm allowing one to address self-selection bias and cleanly estimate an upper-bound single-sex school effect. The…

  18. Significant Change Spotting for Periodic Human Motion Segmentation of Cleaning Tasks Using Wearable Sensors

    PubMed Central

    Liu, Kai-Chun; Chan, Chia-Tai

    2017-01-01

The proportion of the aging population is rapidly increasing around the world, which will cause stress on society and healthcare systems. In recent years, advances in technology have created new opportunities for automatic activities of daily living (ADL) monitoring to improve quality of life and provide adequate medical service for the elderly. Such automatic ADL monitoring requires reliable ADL information at a fine-grained level, especially regarding the status of interaction between body gestures and the environment in the real world. In this work, we propose a significant change spotting mechanism for periodic human motion segmentation during cleaning task performance. A novel approach is proposed based on the search for a significant change of gestures, which can manage critical technical issues in activity recognition, such as continuous data segmentation, individual variance, and category ambiguity. Three typical machine learning classification algorithms are utilized for the identification of the significant change candidates, including a Support Vector Machine (SVM), k-Nearest Neighbors (kNN), and Naive Bayesian (NB) algorithm. Overall, the proposed approach achieves an F1-score of 96.41% using the SVM classifier. The results show that the proposed approach can fulfill the requirement of fine-grained human motion segmentation for automatic ADL monitoring. PMID:28106853
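The core idea — spotting a significant change between adjacent windows of wearable-sensor data as a candidate segment boundary — can be sketched generically. The threshold rule below is a placeholder for the SVM/kNN/NB candidate classification used in the paper, and the window size and threshold are invented:

```python
def spot_changes(series, window, threshold):
    """Return indices of non-overlapping windows whose mean level differs
    from the previous window's mean by more than `threshold` — crude
    candidates for gesture/segment boundaries."""
    means = [sum(series[i:i + window]) / window
             for i in range(0, len(series) - window + 1, window)]
    return [i for i in range(1, len(means))
            if abs(means[i] - means[i - 1]) > threshold]
```

A full system would replace the scalar mean with multi-axis accelerometer features and let a trained classifier accept or reject each candidate boundary.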

  19. Implementation of Paste Backfill Mining Technology in Chinese Coal Mines

    PubMed Central

    Chang, Qingliang; Zhou, Huaqiang; Bai, Jianbiao

    2014-01-01

    Implementation of clean mining technology at coal mines is crucial to protect the environment and maintain balance among energy resources, consumption, and ecology. After reviewing present coal clean mining technology, we introduce the technology principles and technological process of paste backfill mining in coal mines and discuss the components and features of backfill materials, the constitution of the backfill system, and the backfill process. Specific implementation of this technology and its application are analyzed for paste backfill mining in Daizhuang Coal Mine; a practical implementation shows that paste backfill mining can improve the safety and excavation rate of coal mining, which can effectively resolve surface subsidence problems caused by underground mining activities, by utilizing solid waste such as coal gangues as a resource. Therefore, paste backfill mining is an effective clean coal mining technology, which has widespread application. PMID:25258737

  20. Nd:YAG laser double wavelength ablation of pollution encrustation on marble and bonding glues on duplicated painting canvas

    NASA Astrophysics Data System (ADS)

    Batishche, Sergei; Englezis, Apostolis; Gorovets, Tatiana; Kouzmouk, Andrei; Pilipenka, Uladzimir; Pouli, Paraskevi; Tatur, Hennady; Totou, Garyfallia; Ukhau, Viktar

    2005-07-01

In the present study, a newly developed one-beam IR-UV laser cleaning system is presented. This system may be used for different applications in diverse fields, such as outdoor stonework conservation and canvas painting restoration. The simultaneous use of the fundamental radiation of a Q-switched Nd:YAG laser at 1064 nm and its third harmonic at 355 nm was found appropriate for cleaning pollution crusts, while ensuring that no discoloration ("yellowing") would occur. The optimum ratio of UV to IR wavelengths in the final cleaning beam was investigated. In parallel, the same system was tested in diverse applications, such as the removal of bonding glues from duplicated canvases. The optimum laser parameters were investigated both on technical samples and on original paintings.

  1. Implementation of paste backfill mining technology in Chinese coal mines.

    PubMed

    Chang, Qingliang; Chen, Jianhang; Zhou, Huaqiang; Bai, Jianbiao

    2014-01-01

    Implementation of clean mining technology at coal mines is crucial to protect the environment and maintain balance among energy resources, consumption, and ecology. After reviewing present coal clean mining technology, we introduce the technology principles and technological process of paste backfill mining in coal mines and discuss the components and features of backfill materials, the constitution of the backfill system, and the backfill process. Specific implementation of this technology and its application are analyzed for paste backfill mining in Daizhuang Coal Mine; a practical implementation shows that paste backfill mining can improve the safety and excavation rate of coal mining, which can effectively resolve surface subsidence problems caused by underground mining activities, by utilizing solid waste such as coal gangues as a resource. Therefore, paste backfill mining is an effective clean coal mining technology, which has widespread application.

  2. Model-Based Infrared Metrology for Advanced Technology Nodes and 300 mm Wafer Processing

    NASA Astrophysics Data System (ADS)

    Rosenthal, Peter A.; Duran, Carlos; Tower, Josh; Mazurenko, Alex; Mantz, Ulrich; Weidner, Peter; Kasic, Alexander

    2005-09-01

    The use of infrared spectroscopy for production semiconductor process monitoring has evolved recently from primarily unpatterned, i.e. blanket test wafer measurements in a limited historical application space of blanket epitaxial, BPSG, and FSG layers to new applications involving patterned product wafer measurements, and new measurement capabilities. Over the last several years, the semiconductor industry has adopted a new set of materials associated with copper/low-k interconnects, and new structures incorporating exotic materials including silicon germanium, SOI substrates and high aspect ratio trenches. The new device architectures and more chemically sophisticated materials have raised new process control and metrology challenges that are not addressed by current measurement technology. To address the challenges we have developed a new infrared metrology tool designed for emerging semiconductor production processes, in a package compatible with modern production and R&D environments. The tool incorporates recent advances in reflectance instrumentation including highly accurate signal processing, optimized reflectometry optics, and model-based calibration and analysis algorithms. To meet the production requirements of the modern automated fab, the measurement hardware has been integrated with a fully automated 300 mm platform incorporating front opening unified pod (FOUP) interfaces, automated pattern recognition and high throughput ultra clean robotics. The tool employs a suite of automated dispersion-model analysis algorithms capable of extracting a variety of layer properties from measured spectra. The new tool provides excellent measurement precision, tool matching, and a platform for deploying many new production and development applications. In this paper we will explore the use of model based infrared analysis as a tool for characterizing novel bottle capacitor structures employed in high density dynamic random access memory (DRAM) chips. 
We will explore the capability of the tool for characterizing multiple geometric parameters associated with the manufacturing process that are important to the yield and performance of advanced bottle DRAM devices.

  3. A moving mesh unstaggered constrained transport scheme for magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Mocz, Philip; Pakmor, Rüdiger; Springel, Volker; Vogelsberger, Mark; Marinacci, Federico; Hernquist, Lars

    2016-11-01

We present a constrained transport (CT) algorithm for solving the 3D ideal magnetohydrodynamic (MHD) equations on a moving mesh, which maintains the divergence-free condition on the magnetic field to machine precision. Our CT scheme uses an unstructured representation of the magnetic vector potential, making the numerical method simple and computationally efficient. The scheme is implemented in the moving mesh code AREPO. We demonstrate the performance of the approach with simulations of driven MHD turbulence, a magnetized disc galaxy, and a cosmological volume with primordial magnetic field. We compare the outcomes of these experiments to those obtained with a previously implemented Powell divergence-cleaning scheme. While CT and the Powell technique yield similar results in idealized test problems, some differences are seen in situations more representative of astrophysical flows. In the turbulence simulations, the Powell cleaning scheme artificially grows the mean magnetic field, while CT maintains this conserved quantity of ideal MHD. In the disc simulation, CT gives a slower magnetic field growth rate and saturates to equipartition between the turbulent kinetic energy and magnetic energy, whereas Powell cleaning produces a dynamically dominant magnetic field. Similar differences have been observed in adaptive-mesh refinement codes with CT and smoothed-particle hydrodynamics codes with divergence-cleaning. In the cosmological simulation, both approaches give similar magnetic amplification, but Powell exhibits more cell-level noise. CT methods in general are more accurate than divergence-cleaning techniques and, when coupled to a moving mesh, can exploit the advantages of automatic spatial/temporal adaptivity and reduced advection errors, allowing for improved astrophysical MHD simulations.
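The reason a vector-potential-based CT scheme keeps the divergence at machine precision, whereas cleaning schemes only damp or advect the error, is the identity that the divergence of a curl vanishes:

```latex
\mathbf{B} = \nabla \times \mathbf{A}
\quad\Longrightarrow\quad
\nabla \cdot \mathbf{B} = \nabla \cdot \left( \nabla \times \mathbf{A} \right) \equiv 0
```

Evolving A (here, on an unstructured moving mesh) and taking its curl therefore yields a discretely divergence-free B by construction, independent of mesh motion.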

  4. A constrained-gradient method to control divergence errors in numerical MHD

    NASA Astrophysics Data System (ADS)

    Hopkins, Philip F.

    2016-10-01

In numerical magnetohydrodynamics (MHD), a major challenge is maintaining ∇·B = 0. Constrained transport (CT) schemes achieve this but have been restricted to specific methods. For more general (meshless, moving-mesh, ALE) methods, 'divergence-cleaning' schemes reduce the ∇·B errors; however, they can still be significant and can lead to systematic errors which converge away slowly. We propose a new constrained gradient (CG) scheme which augments these with a projection step, and can be applied to any numerical scheme with a reconstruction. This iteratively approximates the least-squares minimizing, globally divergence-free reconstruction of the fluid. Unlike 'locally divergence free' methods, this actually minimizes the numerically unstable ∇·B terms, without affecting the convergence order of the method. We implement this in the mesh-free code GIZMO and compare various test problems. Compared to cleaning schemes, our CG method reduces the maximum ∇·B errors by ~1-3 orders of magnitude (~2-5 dex below typical errors if no ∇·B cleaning is used). By preventing large ∇·B at discontinuities, this eliminates systematic errors at jumps. Our CG results are comparable to CT methods; for practical purposes, the ∇·B errors are eliminated. The cost is modest, ~30 per cent of the hydro algorithm, and the CG correction can be implemented in a range of numerical MHD methods. While for many problems we find Dedner-type cleaning schemes are sufficient for good results, we identify a range of problems where using only Powell or '8-wave' cleaning can produce order-of-magnitude errors.

  5. Development of a solar-powered electric bicycle in bike sharing transportation system

    NASA Astrophysics Data System (ADS)

    Adhisuwignjo, S.; Siradjuddin, I.; Rifa'i, M.; Putri, R. I.

    2017-06-01

Increasing mobility has directly led to deteriorating traffic conditions, extra fuel consumption, increasing automobile exhaust emissions, air pollution and a lower quality of life. Besides being a clean, cheap and equitable mode of transport for short-distance journeys, cycling can offer solutions to the problem of urban mobility. Many cities have promoted cycling, particularly through the implementation of bike sharing. The fourth generation of bike-sharing systems uses electric bicycles, which are considered a clean-technology implementation, and utilization of solar power is probably a key development of this generation that will become the standard in future bike-sharing systems. Electric bikes use batteries as their energy source and thus require a battery charging system powered by solar cells. This research aims to design and implement an electric bicycle battery charging system with a solar energy source, using a fuzzy logic algorithm. The study was conducted by means of an experimental method comprising the design, manufacture and testing of the controller system. The fuzzy algorithm was implemented in the EEPROM of an ATmega8535 microcontroller. The charging current was set at 1.2 A and the fully charged battery voltage was observed to be 40 V. The results showed that the fuzzy logic controller was able to maintain the charging current at 1.2 A with an error of less than 5% around the set point. Charging the electric bike's lead-acid batteries from empty to fully charged took 5 hours. In conclusion, the solar-powered electric bicycle controlled by a fuzzy logic controller keeps the battery charging current stable.
This shows that the fuzzy algorithm can be used as a controller in the charging process for a solar electric bicycle.
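The control loop described above can be sketched with a minimal Mamdani-style fuzzy rule base. The membership ranges and duty-cycle correction magnitudes below are invented for illustration; only the 1.2 A set point comes from the study:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_duty_correction(current, setpoint=1.2):
    """Fuzzify the current error, fire three rules, and defuzzify by
    centroid to get a duty-cycle correction for the charger converter."""
    e = setpoint - current                 # positive: current too low
    neg = tri(e, -0.6, -0.3, 0.0)          # error negative: current too high
    zero = tri(e, -0.3, 0.0, 0.3)          # error near zero: hold
    pos = tri(e, 0.0, 0.3, 0.6)            # error positive: current too low
    # rule consequents: decrease / hold / increase duty cycle
    num = neg * (-0.05) + zero * 0.0 + pos * 0.05
    den = neg + zero + pos
    return num / den if den else 0.0
```

At the set point the correction is zero; below it the controller raises the duty cycle, above it the duty cycle is lowered, which is the stabilizing behavior reported around the 1.2 A set point.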

  6. [Advances in microbial production of alkaline polygalacturonate lyase and its application in clean production of textile industry].

    PubMed

    Liu, Long; Wang, Zhihao; Zhang, Dongxu; Li, Jianghua; Du, Guocheng; Chen, Jian

    2009-12-01

We reviewed the microbial production of alkaline polygalacturonate lyase (PGL) and its application in the clean production of the textile industry. Currently, PGL is mainly produced by microbial fermentation, and Bacillus sp. is an ideal wild strain for PGL production. Microbial PGL production is affected by many factors, including the concentration and feeding mode of the substrate, cell concentration, agitation speed, aeration rate, pH and temperature. Constructing recombinant strains provides an effective alternative for PGL production: the concentration of PGL produced by recombinant Pichia pastoris reached 1305 U/mL in a 10 m3 fermentor, so the recombinant Pichia pastoris has the potential for industrial-scale PGL production. PGL can be applied in the bio-scouring process in the pre-treatment of cotton. Compared with the traditional alkaline cooking process, the application of PGL protects the fiber, improves bio-scouring efficiency, decreases energy consumption and alleviates environmental pollution. Future research will focus on the molecular directed evolution of PGL to make it more suitable for bio-scouring and thereby realize clean production in the textile industry.

  7. Cleaning products and air fresheners: emissions and resulting concentrations of glycol ethers and terpenoids.

    PubMed

    Singer, B C; Destaillats, H; Hodgson, A T; Nazaroff, W W

    2006-06-01

    Experiments were conducted to quantify emissions and concentrations of glycol ethers and terpenoids from cleaning product and air freshener use in a 50-m3 room ventilated at approximately 0.5/h. Five cleaning products were applied full-strength (FS); three were additionally used in dilute solution. FS application of pine-oil cleaner (POC) yielded 1-h concentrations of 10-1300 microg/m3 for individual terpenoids, including alpha-terpinene (90-120), d-limonene (1000-1100), terpinolene (900-1300), and alpha-terpineol (260-700). One-hour concentrations of 2-butoxyethanol and/or d-limonene were 300-6000 microg/m3 after FS use of other products. During FS application including rinsing with sponge and wiping with towels, fractional emissions (mass volatilized/dispensed) of 2-butoxyethanol and d-limonene were 50-100% with towels retained, and approximately 25-50% when towels were removed after cleaning. Lower fractions (2-11%) resulted from dilute use. Fractional emissions of terpenes from FS use of POC were approximately 35-70% with towels retained, and 20-50% with towels removed. During floor cleaning with dilute solution of POC, 7-12% of dispensed terpenes were emitted. Terpene alcohols were emitted at lower fractions: 7-30% (FS, towels retained), 2-9% (FS, towels removed), and 2-5% (dilute). During air-freshener use, d-limonene, dihydromyrcenol, linalool, linalyl acetate, and beta-citronellol were emitted at 35-180 mg/day over 3 days while air concentrations averaged 30-160 microg/m3. While effective cleaning can improve the healthfulness of indoor environments, this work shows that use of some consumer cleaning agents can yield high levels of volatile organic compounds, including glycol ethers--which are regulated toxic air contaminants--and terpenes that can react with ozone to form a variety of secondary pollutants including formaldehyde and ultrafine particles. 
Persons involved in cleaning, especially those who clean occupationally or often, might encounter excessive exposures to these pollutants owing to cleaning product emissions. Mitigation options include screening of product ingredients and increased ventilation during and after cleaning. Certain practices, such as the use of some products in dilute solution vs. full-strength and the prompt removal of cleaning supplies from occupied spaces, can reduce emissions and exposures to 2-butoxyethanol and other volatile constituents. Also, it may be prudent to limit use of products containing ozone-reactive constituents when indoor ozone concentrations are elevated either because of high ambient ozone levels or because of the indoor use of ozone-generating equipment.

  8. A Genetic Algorithm That Exchanges Neighboring Centers for Fuzzy c-Means Clustering

    ERIC Educational Resources Information Center

    Chahine, Firas Safwan

    2012-01-01

    Clustering algorithms are widely used in pattern recognition and data mining applications. Due to their computational efficiency, partitional clustering algorithms are better suited for applications with large datasets than hierarchical clustering algorithms. K-means is among the most popular partitional clustering algorithms, but has a major…
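    As background for the partitional methods this record discusses, here is a minimal sketch of the standard fuzzy c-means updates, the baseline on which a genetic center-exchange scheme would operate; the thesis's own genetic operator is not reproduced, and the data and parameters below are illustrative.

    ```python
    # Standard fuzzy c-means (FCM): alternate center and membership updates.
    import numpy as np

    def fcm(X, c=2, m=2.0, iters=50, seed=0):
        """Return cluster centers and the fuzzy membership matrix U."""
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), c))
        U /= U.sum(axis=1, keepdims=True)        # memberships sum to 1 per point
        for _ in range(iters):
            W = U ** m                           # fuzzified memberships
            centers = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted means
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            d = np.maximum(d, 1e-12)
            # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
            U = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1))).sum(axis=2)
        return centers, U

    # Two well-separated blobs: the centers converge near the blob means.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
    centers, U = fcm(X)
    ```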

  9. Prediction of Protein-Protein Interaction Sites with Machine-Learning-Based Data-Cleaning and Post-Filtering Procedures.

    PubMed

    Liu, Guang-Hui; Shen, Hong-Bin; Yu, Dong-Jun

    2016-04-01

    Accurately predicting protein-protein interaction sites (PPIs) is currently a hot topic because it has been demonstrated to be very useful for understanding disease mechanisms and designing drugs. Machine-learning-based computational approaches have been broadly utilized and demonstrated to be useful for PPI prediction. However, directly applying traditional machine learning algorithms, which often assume that samples in different classes are balanced, often leads to poor performance because of the severe class imbalance that exists in the PPI prediction problem. In this study, we propose a novel method for improving PPI prediction performance by relieving the severity of class imbalance with a data-cleaning procedure and reducing predicted false positives with a post-filtering procedure. First, a machine-learning-based data-cleaning procedure is applied to remove marginal targets, which may have a negative effect on training a model with a clear classification boundary, from the majority samples, relieving the severity of class imbalance in the original training dataset; then, a prediction model is trained on the cleaned dataset; finally, an effective post-filtering procedure is used to reduce potential false positive predictions. Stringent cross-validation and independent validation tests on benchmark datasets demonstrated the efficacy of the proposed method, which exhibits highly competitive performance compared with existing state-of-the-art sequence-based PPI predictors and should supplement existing PPI prediction methods.
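    The data-cleaning idea above can be sketched generically: drop "marginal" majority-class samples that sit close to minority samples before training, softening the class imbalance. The distance criterion and threshold below are illustrative, not the paper's actual procedure.

    ```python
    # Generic distance-based cleaning of majority-class samples near the
    # minority class; the criterion here is an illustrative assumption.
    import numpy as np

    def clean_majority(X, y, minority_label=1, k_dist=1.0):
        """Remove majority samples within k_dist of any minority sample."""
        X_min = X[y == minority_label]
        keep = []
        for i, (x, lbl) in enumerate(zip(X, y)):
            if lbl == minority_label:
                keep.append(i)                     # keep every minority sample
                continue
            d = np.linalg.norm(X_min - x, axis=1).min()
            if d > k_dist:                         # keep only clear majority samples
                keep.append(i)
        return X[keep], y[keep]

    # The majority point 0.5 away from the minority point is dropped as marginal:
    X = np.array([[0.0, 0.0], [0.5, 0.0], [5.0, 0.0]])
    y = np.array([1, 0, 0])
    Xc, yc = clean_majority(X, y)
    ```

    A model trained on the cleaned set sees a sharper boundary; the paper then applies a post-filter to the predictions, which is not sketched here.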

  10. The application of knowledge discovery in databases to post-marketing drug safety: example of the WHO database.

    PubMed

    Bate, A; Lindquist, M; Edwards, I R

    2008-04-01

    After market launch, new information on adverse effects of medicinal products is almost exclusively first highlighted by spontaneous reporting. As data sets of spontaneous reports have become larger, and computational capability has increased, quantitative methods have been increasingly applied to such data sets. The screening of such data sets is an application of knowledge discovery in databases (KDD). Effective KDD is an iterative and interactive process made up of the following steps: developing an understanding of an application domain, creating a target data set, data cleaning and pre-processing, data reduction and projection, choosing the data mining task, choosing the data mining algorithm, data mining, interpretation of results and consolidating and using acquired knowledge. The process of KDD as it applies to the analysis of spontaneous reports can be exemplified by its routine use on the 3.5 million suspected adverse drug reaction (ADR) reports in the WHO ADR database. Examples of new adverse effects first highlighted by the KDD process on WHO data include topiramate glaucoma, infliximab vasculitis and the association of selective serotonin reuptake inhibitors (SSRIs) and neonatal convulsions. The KDD process has already improved our ability to highlight previously unsuspected ADRs for clinical review in spontaneous reporting, and we anticipate that such techniques will be increasingly used in the successful screening of other healthcare data sets such as patient records in the future.
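    The data-mining step in this kind of KDD pipeline is typically a disproportionality screen over a drug-event report count table. Below is a sketch of one simple measure, the proportional reporting ratio (PRR); the WHO centre's actual routine method (a Bayesian information component) is not reproduced here, and the counts are invented.

    ```python
    # Proportional reporting ratio from a 2x2 contingency table of reports.
    def prr(a, b, c, d):
        """a = reports with drug and event, b = drug without event,
           c = other drugs with event,     d = other drugs without event."""
        rate_drug = a / (a + b)     # event rate among the drug's reports
        rate_other = c / (c + d)    # event rate among all other reports
        return rate_drug / rate_other

    # A PRR well above 1 flags a drug-event pair for clinical review:
    print(prr(30, 970, 100, 98900))  # ≈ 29.7
    ```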

  11. SRF Training Workshop Support

    EPA Pesticide Factsheets

    EPA is soliciting applications from eligible applicants to provide training workshop support activities for the State Revolving Fund (SRF) programs, the Clean Water SRF program and the Drinking Water SRF programs.

  12. Design of underwater superoleophobic TiO{sub 2} coatings with additional photo-induced self-cleaning properties by one-step route bio-inspired from fish scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hao; Guo, Zhiguang, E-mail: zguo@licp.cas.cn; State Key Laboratory of Solid Lubrication, Lanzhou Institute of Chemical Physics, Chinese Academy of Sciences, Lanzhou 730000

    Self-cleaning properties inspired by the structures and functions of some creatures are of great interest since the late 20th century. In this paper, TiO{sub 2} coatings with hierarchical rutile TiO{sub 2} flowers on fluorine-doped tin oxide substrate are fabricated through a simple one-step hydrothermal method. The flower-like coatings exhibit superhydrophilicity in air and superoleophobicity underwater with a contact angle as high as 157°, presenting good underwater self-cleaning performance. In addition, when contaminated by oleic acid, the as-prepared TiO{sub 2} coatings also exhibit excellent photocatalytic capability under ultraviolet irradiation, which demonstrated self-cleaning properties in a different way. This self-cleaning film provides a good strategy for some industrial and ocean applications.

  13. Characterization of Laser Cleaning of Artworks

    PubMed Central

    Marczak, Jan; Koss, Andrzej; Targowski, Piotr; Góra, Michalina; Strzelec, Marek; Sarzyński, Antoni; Skrzeczanowski, Wojciech; Ostrowski, Roman; Rycyk, Antoni

    2008-01-01

    The main tasks of conservators of artworks and monuments are the estimation and analysis of damage (present condition), object conservation (the cleaning process), and the protection of an object against further degradation. One physical method that is becoming more and more popular for dirt removal is laser cleaning. This method is non-contact, selective, local, controlled, and self-limiting; it gives immediate feedback and preserves even the gentlest relief - the trace of a paintbrush. This paper presents the application of different, selected physical sensing methods to characterize the condition of works of art as well as the laser cleaning process itself. It includes optical surface measurements tested in our laboratories (e.g. colorimetry, scatterometry, interferometry), infrared thermography, optical coherence tomography and acoustic measurements for “on-line” evaluation of cleaning progress. Results of laser spectrometry analyses (LIBS, Raman) illustrate the identification and dating of the superficial layers of objects. PMID:27873884

  14. Using Field Measurements to Assess Aging of Self-Cleaning High-Reflectance Paint

    NASA Astrophysics Data System (ADS)

    Takebayashi, Hideki; Tanabe, Junichiro; Aoyama, Taizo; Sonoda, Takeshi; Nakanishi, Yasushi

    2017-08-01

    Continuous field measurements were used to evaluate the aging of solar reflectance on self-cleaning coatings for roofs in comparison with conventional coatings that have no self-cleaning function. Solar reflectance on self-cleaning coatings decreases by about 6 % per year, with annual variations, due to the adhesion of dirt. Solar reflectance on conventional coatings, on the other hand, decreases greatly, by approximately 18 % within four months of the coating's application, due to the adhesion of dirt. It then gradually recovers, at a rate of about 4 % per year with annual variations, owing to degradation of the coating. Because of this degradation of the conventional coating, the difference in solar reflectance between the self-cleaning coating and the conventional coating becomes almost zero within two years. Both the adhesion of dirt and coating degradation by chalking affect the temporal change of solar reflectance with annual variation.
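    The reported rates can be checked with a quick calculation: with the self-cleaning coating losing about 6 % per year and the conventional one dropping about 18 % in four months and then recovering about 4 % per year, the two reflectance losses nearly meet after two years.

    ```python
    # Back-of-envelope check of the two-year crossover described above.
    self_cleaning_loss = 0.06 * 2                   # 12 % lost after two years
    conventional_loss = 0.18 - 0.04 * (2 - 4 / 12)  # initial drop minus recovery
    print(self_cleaning_loss, round(conventional_loss, 3))  # 0.12 vs ~0.113
    ```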

  15. Bioinspired Surface for Low Drag, Self-Cleaning, and Antifouling: Shark Skin, Butterfly and Rice Leaf Effects

    NASA Astrophysics Data System (ADS)

    Bixler, Gregory D.

    In this thesis, first presented is an overview of inorganic fouling and biofouling, which are generally undesirable in many medical, marine, and industrial applications. A survey of nature's flora and fauna is presented in order to discover new antifouling methods that could be mimicked for engineering applications. New antifouling methods will presumably incorporate a combination of physical and chemical controls. Presented are mechanisms and experimental results focusing on laminar and turbulent drag-reducing, shark-skin-inspired riblet surfaces. This includes new laser-etched and riblet-film samples for closed-channel drag using water, oil, and air, as well as in a wind tunnel. Also presented are mechanisms and experimental results focusing on the newly discovered rice leaf and butterfly wing effect surfaces. Morphology, drag, self-cleaning, contact angle, and contact angle hysteresis data are presented to understand the role of sample geometrical dimensions, wettability, viscosity, and velocity. Hierarchical liquid-repellent coatings combining nano- and micro-sized features and particles are utilized to recreate or combine various effects. Such surfaces have been fabricated with photolithography, soft lithography, hot embossing, and coating techniques. Discussion is provided along with new conceptual models describing the role of surface structures related to low drag, self-cleaning, and antifouling properties. Modeling provides design guidance when developing novel low-drag and self-cleaning surfaces for medical, marine, and industrial applications.

  16. Inhalation exposure to cleaning products: application of a two-zone model.

    PubMed

    Earnest, C Matt; Corsi, Richard L

    2013-01-01

    In this study, modifications were made to previously applied two-zone models to address important factors that can affect exposures during cleaning tasks. Specifically, we expand on previous applications of the two-zone model by (1) introducing the source in discrete elements (source cells) as opposed to a complete instantaneous release, (2) placing source cells in both the inner (near person) and outer zones concurrently, (3) treating each source cell as an independent mixture of multiple constituents, and (4) tracking the time-varying liquid concentration and emission rate of each constituent in each source cell. Three experiments were performed in an environmentally controlled chamber with a thermal mannequin and a simplified pure chemical source to simulate emissions from a cleaning product. Gas phase concentration measurements were taken in the bulk air and in the breathing zone of the mannequin to evaluate the model. The mean ratio of the integrated concentration in the mannequin's breathing zone to the concentration in the outer zone was 4.3 (standard deviation, σ = 1.6). The mean ratio of measured concentration in the breathing zone to predicted concentrations in the inner zone was 0.81 (σ = 0.16). Intake fractions ranged from 1.9 × 10⁻³ to 2.7 × 10⁻³. Model results reasonably predict those of previous exposure monitoring studies and indicate the inadequacy of well-mixed single-zone model applications for some but not all cleaning events.
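    The classic two-zone (near-field/far-field) mass balance underlying this work can be sketched with a constant, single-constituent source in the inner zone; the paper's discrete source cells and multi-constituent tracking are not reproduced, and the volumes and flows below are illustrative, not the chamber's actual parameters.

    ```python
    # Two-zone model: inner zone receives emission E and exchanges air with
    # the ventilated outer zone at interzonal flow beta.
    def two_zone(E=10.0, V1=1.0, V2=49.0, beta=5.0, Q=25.0, t_end=2.0, dt=1e-3):
        """Euler-integrate inner (C1) and outer (C2) concentrations (mg/m3).
        E in mg/h, volumes in m3, beta and Q in m3/h, time in hours."""
        C1 = C2 = 0.0
        for _ in range(int(t_end / dt)):
            dC1 = (E + beta * (C2 - C1)) / V1          # emission + interzonal mixing
            dC2 = (beta * (C1 - C2) - Q * C2) / V2     # mixing - ventilation loss
            C1 += dC1 * dt
            C2 += dC2 * dt
        return C1, C2

    C1, C2 = two_zone()
    print(C1 > C2)  # the near-person zone sees the higher concentration
    ```

    The inner/outer concentration gap is exactly what makes well-mixed single-zone models underestimate breathing-zone exposure.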

  17. Robust and Superhydrophobic Surface Modification by a "Paint + Adhesive" Method: Applications in Self-Cleaning after Oil Contamination and Oil-Water Separation.

    PubMed

    Chen, Baiyi; Qiu, Jianhui; Sakai, Eiichi; Kanazawa, Nobuhiro; Liang, Ruilu; Feng, Huixia

    2016-07-13

    Conventional superhydrophobic surfaces have always depended on expensive, sophisticated, and fragile roughness structures. Poor robustness has therefore become the bottleneck for large-scale industrial applications of superhydrophobic surfaces, and a superhydrophobic surface with firm robustness urgently needs to be developed. In this work, we created a versatile strategy to fabricate robust, self-cleaning, superhydrophobic surfaces for both soft and hard substrates. We created an ethanol-based suspension of perfluorooctyltriethoxysilane-modified calcium carbonate nanoparticles which can be sprayed onto both hard and soft substrates to form superhydrophobic surfaces. For all kinds of substrates, spray adhesive was coated directly onto the cleaned substrate surfaces to promote robustness. These superhydrophobic surfaces showed remarkable robustness against knife scratch and sandpaper abrasion, retaining their superhydrophobicity even after 30 abrasion cycles with sandpaper. Moreover, the superhydrophobic surfaces have shown promising potential applications in self-cleaning and oil-water separation. The surfaces retained their self-cleaning property even when immersed in oil. In oil-water separation tests, the water content in the oil after separation of various mixtures was below 150 ppm, and for toluene as low as 55 ppm. Furthermore, the as-prepared device for oil-water separation could be cycled 6 times and still retained excellent oil-water separation efficiency.

  18. An evaluation of alternative cleaning methods for removing an organic contaminant from a stainless steel part

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boyd, J.L.

    1996-08-01

    As of December 1995, the manufacture of Freon, along with many other chlorofluorocarbons (CFCs), was prohibited by the Clean Air Act of 1990 (CAA). The ban of CFC solvents has forced manufacturers across the country to search for alternative metal cleaning techniques. The objective of this study was to develop a thorough, scientifically based approach for resolving one specific manufacturer's problem of removing organic contamination from a stainless steel part. This objective was accomplished with an approach that involved: (1) defining the problem, (2) identifying the process constraints, (3) researching alternate cleaning methods, (4) researching applicable government regulations, (5) performing a scientific evaluation and (6) drawing conclusions.

  19. Si/SiGe n-type resonant tunneling diodes fabricated using in situ hydrogen cleaning

    NASA Astrophysics Data System (ADS)

    Suet, Z.; Paul, D. J.; Zhang, J.; Turner, S. G.

    2007-05-01

    In situ hydrogen cleaning to reduce the surface segregation of n-type dopants in SiGe epitaxy has been used to fabricate Si/SiGe resonant tunneling diodes in a joint gas source chemical vapor deposition and molecular beam epitaxial system. Diodes fabricated without the in situ clean demonstrate linear current-voltage characteristics, while a 15 min hydrogen clean produces negative differential resistance with peak-to-valley current ratios up to 2.2 and peak current densities of 5.0 A/cm2 at 30 K. Analysis of the valley current and the band structure of the devices suggest methods for increasing the operating temperature of Si/SiGe resonant tunneling diodes as required for applications.

  20. Implementation of environmentally compliant cleaning and insulation bonding for MNASA

    NASA Technical Reports Server (NTRS)

    Hutchens, Dale E.; Keen, Jill M.; Smith, Gary M.; Dillard, Terry W.; Deweese, C. Darrell; Lawson, Seth W.

    1995-01-01

    Historically, many subscale and full-scale rocket motors have employed environmentally and physiologically harmful chemicals during the manufacturing process. This program examines the synergy and interdependency between environmentally acceptable materials for solid rocket motor insulation applications, bonding, corrosion inhibiting, painting, priming, and cleaning, and then implements new materials and processes in subscale motors. Tests have been conducted to eliminate or minimize hazardous chemicals used in the manufacture of modified-NASA materials test motor (MNASA) components and identify alternate materials and/or processes following NASA Operational Environmental Team (NOET) priorities. This presentation describes implementation of high pressure water refurbishment cleaning, aqueous precision cleaning using both Brulin 815 GD and Jettacin, and insulation case bonding using ozone depleting chemical (ODC) compliant primers and adhesives.

  1. Robust self-cleaning surfaces that function when exposed to either air or oil

    NASA Astrophysics Data System (ADS)

    Lu, Yao; Sathasivam, Sanjayan; Song, Jinlong; Crick, Colin R.; Carmalt, Claire J.; Parkin, Ivan P.

    2015-03-01

    Superhydrophobic self-cleaning surfaces are based on surface micro/nanomorphologies; however, such surfaces are mechanically weak and stop functioning when exposed to oil. We have created an ethanolic suspension of perfluorosilane-coated titanium dioxide nanoparticles that forms a paint that can be sprayed, dipped, or extruded onto both hard and soft materials to create a self-cleaning surface that functions even upon immersion in oil. Commercial adhesives were used to bond the paint to various substrates and promote robustness. These surfaces maintained their water repellency after finger-wipe, knife-scratch, and even 40 abrasion cycles with sandpaper. The formulations developed can be used on clothes, paper, glass, and steel for a myriad of self-cleaning applications.

  2. 77 FR 7148 - Notice of Approval of Clean Air Act Outer Continental Shelf Permits Issued to Shell Gulf of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-10

    ... Shelf Permits Issued to Shell Gulf of Mexico, Inc., and Shell Offshore, Inc. for the Discoverer... Clean Air Act Outer Continental Shelf (OCS) permit applications, one from Shell Gulf of Mexico, Inc., for operation of the Discoverer drillship in the Chukchi Sea and one from Shell Offshore, Inc...

  3. Metal sponge for cryosorption pumping applications

    DOEpatents

    Myneni, Ganapati R.; Kneisel, Peter

    1995-01-01

    A system has been developed for adsorbing gases at high vacuum in a closed area. The system utilizes large-surface-area, clean, anodized metal surfaces at low temperatures to adsorb the gases. The large clean anodized metal surface is referred to as a metal sponge. The metal sponge generates or maintains the high vacuum by increasing the available active cryosorbing surface area.

  4. Quantitative cleaning characterization of a lithium-fluoride ion diode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menge, P.R.; Cuneo, M.E.

    An ion source cleaning testbed was created to test plasma-cleaning techniques, and to provide quantitative data on plasma-cleaning protocols prior to implementation on the SABRE accelerator. The testbed was designed to resolve issues regarding the quantity of contaminants absorbed by the anode source (LiF), and the best cleaning methodology. A test chamber was devised containing a duplicate of the SABRE diode. Radio-frequency (RF) power was fed to the anode, which was isolated from ground and thus served as the plasma discharge electrode. RF plasma discharges in 1--3 mtorr of Ar with 10% O{sub 2} were found to provide the best cleaning of the LiF surface. X-ray photoelectron spectroscopy (XPS) showed that the LiF could accrue dozens of monolayers of carbon just by sitting in a 2 {times} 10{sup {minus}5} torr vacuum for 24 h. Tests of various discharge cleaning protocols indicated that 15 min of an Ar/O{sub 2} discharge was sufficient to reduce this initial 13--45 monolayers of carbon impurities to 2--4 monolayers. Rapid recontamination of the LiF was also observed. Up to ten monolayers of carbon returned in 2 min after termination of the plasma discharge and subsequent pumping back to the 10{sup {minus}5} torr range. Heating of the LiF also was found to provide anode cleaning. Application of heating combined with plasma cleaning provided the highest cleaning rates.

  5. Directional Histogram Ratio at Random Probes: A Local Thresholding Criterion for Capillary Images

    PubMed Central

    Lu, Na; Silva, Jharon; Gu, Yu; Gerber, Scott; Wu, Hulin; Gelbard, Harris; Dewhurst, Stephen; Miao, Hongyu

    2013-01-01

    With the development of micron-scale imaging techniques, capillaries can be conveniently visualized using methods such as two-photon and whole mount microscopy. However, the presence of background staining, leaky vessels and the diffusion of small fluorescent molecules can lead to significant complexity in image analysis and loss of information necessary to accurately quantify vascular metrics. One solution to this problem is the development of accurate thresholding algorithms that reliably distinguish blood vessels from surrounding tissue. Although various thresholding algorithms have been proposed, our results suggest that without appropriate pre- or post-processing, the existing approaches may fail to obtain satisfactory results for capillary images that include areas of contamination. In this study, we propose a novel local thresholding algorithm, called directional histogram ratio at random probes (DHR-RP). This method explicitly considers the geometric features of tube-like objects in conducting image binarization, and has a reliable performance in distinguishing small vessels from either clean or contaminated background. Experimental and simulation studies suggest that our DHR-RP algorithm is superior over existing thresholding methods. PMID:23525856
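    DHR-RP is a local thresholding criterion tailored to tube-like structures; as a point of contrast, here is a sketch of the simplest local scheme (threshold = local mean + k·std, Niblack-style), the kind of baseline such methods improve on for capillary images. The window size and k are illustrative assumptions.

    ```python
    # Baseline local thresholding: each pixel is compared against the mean
    # and standard deviation of its own neighbourhood window.
    import numpy as np

    def local_threshold(img, win=5, k=0.2):
        """Binarize: pixel is foreground if above its window's mean + k*std."""
        h, w = img.shape
        pad = win // 2
        padded = np.pad(img.astype(float), pad, mode="reflect")
        out = np.zeros((h, w), dtype=bool)
        for i in range(h):
            for j in range(w):
                patch = padded[i:i + win, j:j + win]
                out[i, j] = img[i, j] > patch.mean() + k * patch.std()
        return out

    # A one-pixel-wide bright "vessel" on a dark background is recovered:
    img = np.zeros((11, 11))
    img[5, :] = 1.0
    mask = local_threshold(img)
    ```

    Unlike this purely statistical rule, DHR-RP additionally exploits the directional geometry of vessels, which is what makes it robust to contaminated backgrounds.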

  6. Selfish Gene Algorithm Vs Genetic Algorithm: A Review

    NASA Astrophysics Data System (ADS)

    Ariff, Norharyati Md; Khalid, Noor Elaiza Abdul; Hashim, Rathiah; Noor, Noorhayati Mohamed

    2016-11-01

    Evolutionary algorithms (EAs) are among the algorithms inspired by nature, and within little more than a decade hundreds of papers have reported their successful applications. This paper reviews the Selfish Gene Algorithm (SFGA), one of the latest EAs, inspired by the Selfish Gene Theory, an interpretation of Darwinian ideas proposed by the biologist Richard Dawkins in 1989. Following a brief introduction to the SFGA, the chronology of its evolution is presented. The purpose of this paper is to present an overview of the concepts of the SFGA as well as its opportunities and challenges. Accordingly, the history of the algorithm and the steps involved in it are discussed, and its different applications, together with an analysis of those applications, are evaluated.

  7. Does the Use of Clean or Sterile Dressing Technique Affect the Incidence of Wound Infection?

    PubMed

    Kent, Dea J; Scardillo, Jody N; Dale, Barbara; Pike, Caitlin

    The purpose of this article is to examine the evidence and provide recommendations for the use of clean or sterile dressing technique with dressing application to prevent wound infection. In all persons with acute or chronic wounds, does the use of clean or sterile dressing technique affect incidence of wound infection? A search of the literature was performed by a trained university librarian, which resulted in 473 articles that examined any age group that dealt with application of a wound dressing using either sterile or nonsterile technique. A systematic approach was used to review titles, abstracts, and text, yielding 4 studies that met inclusion criteria. Strength of the evidence was rated using rating methodology from Essential Evidence Plus: Levels of Evidence and Oxford Center for Evidence-Based Medicine, adapted by Gray and colleagues. Johns Hopkins Nursing Evidence-Based Practice Nursing Research Appraisal Tool was used to rate the quality of the evidence. All 4 studies reported no significant difference in the rate of wound infection when using either clean or sterile technique with dressing application. The strength of the evidence for the identified studies was identified as level 2 (1 level A, 3 level B). The study sizes were variable, and the wounds included do not represent the continuum of wounds clinically encountered across the board. Evidence indicates that the use of clean technique for acute wound care is a clinically effective intervention that does not affect the incidence of infection. There is no recommendation that can be made regarding type of dressing technique for a chronic wound due to the lack of evidence in the literature.

  8. Issues and approaches for ensuring effective communication on acceptable daily exposure (ADE) values applied to pharmaceutical cleaning.

    PubMed

    Olson, Michael J; Faria, Ellen C; Hayes, Eileen P; Jolly, Robert A; Barle, Ester Lovsin; Molnar, Lance R; Naumann, Bruce D; Pecquet, Alison M; Shipp, Bryan K; Sussman, Robert G; Weideman, Patricia A

    2016-08-01

    This manuscript centers on communication with key stakeholders of the concepts and program goals involved in the application of health-based pharmaceutical cleaning limits. Implementation of health-based cleaning limits, as distinct from other standards such as 1/1000th of the lowest clinical dose, is a concept recently introduced into regulatory domains. While there is a great deal of technical detail in the written framework underpinning the use of Acceptable Daily Exposures (ADEs) in cleaning (for example ISPE, 2010; Sargent et al., 2013), little is available to explain how to practically create a program which meets regulatory needs while also fulfilling good manufacturing practice (GMP) and other expectations. The lack of a harmonized approach for program implementation and communication across stakeholders can ultimately foster inappropriate application of these concepts. Thus, this period in time (2014-2017) could be considered transitional with respect to influencing best practice related to establishing health-based cleaning limits. Suggestions offered in this manuscript are intended to encourage full and accurate communication regarding both scientific and administrative elements of health-based ADE values used in pharmaceutical cleaning practice. This is a large and complex effort that requires: 1) clearly explaining key terms and definitions, 2) identification of stakeholders, 3) assessment of stakeholders' subject matter knowledge, 4) formulation of key messages fit to stakeholder needs, 5) identification of effective and timely means for communication, and 6) allocation of time, energy, and motivation for initiating and carrying through with communications. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Onboard tagging for real-time quality assessment of photoplethysmograms acquired by a wireless reflectance pulse oximeter.

    PubMed

    Li, Kejia; Warren, Steve; Natarajan, Balasubramaniam

    2012-02-01

    Onboard assessment of photoplethysmogram (PPG) quality could reduce unnecessary data transmission on battery-powered wireless pulse oximeters and improve the viability of the electronic patient records to which these data are stored. Such algorithms show promise for increasing the intelligence level of formerly "dumb" medical devices: devices that acquire and forward data but leave data interpretation to the clinician or host system. To this end, the authors have developed a unique onboard feature detection algorithm to assess the quality of PPGs acquired with a custom reflectance-mode wireless pulse oximeter. The algorithm uses a Bayesian hypothesis testing method to analyze four features extracted from raw and decimated PPG data in order to determine whether the original data comprise valid PPG waveforms or whether they are corrupted by motion or other environmental influences. Based on these results, the algorithm further calculates heart rate and blood oxygen saturation from a "compact representation" structure. PPG data were collected from 47 subjects to train the feature detection algorithm and to gauge its performance. A MATLAB interface was also developed to visualize the features extracted, the algorithm flow, and the decision results, where all algorithm-related parameters and decisions were ascertained on the wireless unit prior to transmission. For the data sets acquired here, the algorithm was 99% effective in identifying clean, usable PPGs versus nonsaturated data that did not demonstrate meaningful pulsatile waveshapes, PPGs corrupted by motion artifact, and data affected by signal saturation.
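    A Bayesian hypothesis test of the kind described can be sketched as a naive-Bayes decision: each extracted feature is scored under Gaussian models for the "clean" and "corrupted" hypotheses, and the larger log-posterior wins. The feature statistics below are invented for illustration, not the paper's trained values.

    ```python
    # Two-hypothesis Gaussian naive-Bayes decision on extracted PPG features.
    import math

    # (mean, std) per feature under each hypothesis -- illustrative numbers.
    CLEAN = [(1.2, 0.2), (0.8, 0.1)]
    CORRUPT = [(2.5, 0.8), (0.3, 0.3)]
    PRIOR_CLEAN = 0.5

    def log_gauss(x, mu, sigma):
        """Log-density of a Gaussian at x."""
        return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

    def is_clean(features):
        """Compare log-posteriors of the 'clean' and 'corrupted' hypotheses."""
        lc = math.log(PRIOR_CLEAN) + sum(
            log_gauss(x, m, s) for x, (m, s) in zip(features, CLEAN))
        lk = math.log(1 - PRIOR_CLEAN) + sum(
            log_gauss(x, m, s) for x, (m, s) in zip(features, CORRUPT))
        return lc > lk

    print(is_clean([1.1, 0.75]))  # features near the clean model -> True
    ```

    On the oximeter this decision gates transmission: only segments tagged clean proceed to heart-rate and SpO2 estimation.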

  10. Application of simplified bioclean apparatuses for treatment of acute leukemia.

    PubMed

    Hasegawa, H; Horiuchi, A

    1983-01-01

    We used a portable horizontal laminar-air-flow clean bed and an open horizontal laminar-air-flow fan (clean wall unit) for treating patients with acute leukemia. The level of cleanliness as shown in the nonviable and viable particle counts was class 100 and class 1,000 at the head and foot, respectively, of the bed in the clean-bed rooms, while it was class 100 and class 10,000 respectively, in the clean-wall-unit rooms. The level of cleanliness in the open wards, on the other hand, was class 1,000,000. The incidence of infectious complications in the clean-bed rooms was 3.1/100 days when the granulocyte count was 1,000/mm3 or less, 3.9/100 days when the count was 500/mm3 or less and 6.1/100 days when it was 100/mm3 or less. In the clean-wall-unit rooms, these values were 3.1, 3.7 and 7.1, respectively, while in the open wards they were 4.6, 6.1 and 15.0. Thus, it was ascertained that, as the granulocyte count decreased, the incidence of infectious complications became significantly higher in the open wards than in the clean-bed rooms or the clean-wall-unit rooms. No complication of pneumonia was found in 37 patients with acute leukemia in the clean-bed rooms or in 40 in the clean-wall-unit rooms. Among 36 patients treated in the open wards, on the other hand, the complication of pneumonia was found in four. From the above results, it is believed that the use of clean-bed rooms or clean-wall-unit rooms is an extremely effective supplementary treatment method for preventing respiratory tract infection complications in patients with acute leukemia.

  11. Reconstruction of noisy and blurred images using blur kernel

    NASA Astrophysics Data System (ADS)

    Ellappan, Vijayan; Chopra, Vishal

    2017-11-01

    Blur is common in many digital images. It can be caused by motion of the camera or of objects in the scene. In this work we propose a new method for deblurring images that uses sparse representation to identify the blur kernel. By analyzing image coordinates at coarse and fine scales, we estimate the kernel from the image coordinates and, from that observation, obtain the motion angle of the shaken or blurred image. We then calculate the length of the motion kernel using the Radon and Fourier transforms, and apply the Lucy-Richardson algorithm, a non-blind deconvolution (NBID) method, to produce a cleaner, less noisy output image. All operations are performed in the MATLAB IDE.
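    The non-blind Lucy-Richardson step can be illustrated with a minimal 1-D sketch. The paper works on 2-D images in MATLAB; the circular convolution, toy signal, and iteration count below are simplifying assumptions:

```python
def convolve(signal, kernel):
    # 'same'-size circular convolution, sufficient for a toy demonstration
    n, k = len(signal), len(kernel)
    out = [0.0] * n
    for i in range(n):
        for j in range(k):
            out[i] += signal[(i - j + k // 2) % n] * kernel[j]
    return out

def richardson_lucy(observed, psf, iterations=50):
    # Non-blind Richardson-Lucy deconvolution (the NBID step named above):
    # iteratively reweight the estimate by the back-projected ratio of
    # observed to re-blurred data.
    psf_mirror = psf[::-1]
    estimate = [1.0] * len(observed)
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [o / max(b, 1e-12) for o, b in zip(observed, blurred)]
        correction = convolve(ratio, psf_mirror)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

# A spike blurred by a 3-tap motion kernel is progressively recovered:
psf = [0.25, 0.5, 0.25]
truth = [0, 0, 0, 4, 0, 0, 0, 0]
observed = convolve(truth, psf)
restored = richardson_lucy(observed, psf)
print([round(v, 2) for v in restored])
```

    In the paper's pipeline, the kernel fed to this step is the motion kernel whose angle and length were estimated via the Radon and Fourier analyses.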

  12. On a Chirplet Transform Based Method for Co-channel Voice Separation

    NASA Astrophysics Data System (ADS)

    Dugnol, B.; Fernández, C.; Galiano, G.; Velasco, J.

    We use signal- and image-theory-based algorithms to estimate the number of wolves emitting howls or barks in a given field recording, as an individual-counting alternative to traditional trace-collection methodologies. We proceed in two steps. First, we clean and enhance the signal by applying PDE-based image processing algorithms to the signal spectrogram. Second, assuming that the wolf chorus may be modelled as a sum of nonlinear chirps, we use the quadratic energy distribution corresponding to the Chirplet Transform of the signal to produce estimates of the corresponding instantaneous frequencies, chirp rates and amplitudes at each instant of the recording. We finally establish suitable criteria to decide how such estimates are connected in time.

  13. Combining approaches to on-line handwriting information retrieval

    NASA Astrophysics Data System (ADS)

    Peña Saldarriaga, Sebastián; Viard-Gaudin, Christian; Morin, Emmanuel

    2010-01-01

    In this work, we propose to combine two quite different approaches for retrieving handwritten documents. Our hypothesis is that different retrieval algorithms should retrieve different sets of documents for the same query; therefore, significant improvements in retrieval performance can be expected. The first approach is based on information retrieval techniques carried out on the noisy texts obtained through handwriting recognition, while the second approach is recognition-free, using a word spotting algorithm. Results show that for texts having a word error rate (WER) lower than 23%, the performance obtained with the combined system is close to the performance obtained on clean digital texts. In addition, for poorly recognized texts (WER > 52%), an improvement of nearly 17% can be observed with respect to the best available baseline method.
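    One simple way to combine a recognition-based ranking with a word-spotting ranking, in the spirit of the combination described above, is reciprocal rank fusion. The document identifiers and the constant k below are illustrative, and the paper's actual fusion scheme may differ:

```python
def reciprocal_rank_fusion(rankings, k=60):
    # Combine several ranked lists: a document ranked highly by any
    # retrieval system accumulates a large reciprocal-rank score.
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical ranked results for one query from the two approaches:
recognition_based = ["d3", "d1", "d7", "d2"]  # IR over recognized (noisy) text
word_spotting = ["d1", "d3", "d9", "d7"]      # recognition-free word spotting
fused = reciprocal_rank_fusion([recognition_based, word_spotting])
print(fused[:3])  # d1/d3 lead, then d7
```

    Documents retrieved by both systems rise to the top, which is exactly the complementarity hypothesis the paper tests.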

  14. Morphometric analysis of root canal cleaning after rotary instrumentation with or without laser irradiation

    NASA Astrophysics Data System (ADS)

    Marchesan, Melissa A.; Geurisoli, Danilo M. Z.; Brugnera, Aldo, Jr.; Barbin, Eduardo L.; Pecora, Jesus D.

    2002-06-01

    The present study examined root canal cleaning under the optical microscope after rotary instrumentation with ProFile .04, with or without laser application at different output energies. Cleaning and shaping can be accomplished manually, with ultrasonic and subsonic devices, or with rotary instruments; recently, developments in laser radiation have shown promising results for disinfection and smear layer removal. In this study, 30 palatal maxillary molar roots were examined under an optical microscope after rotary instrumentation with ProFile .04 with or without Er:YAG laser application (KaVo KeyLaser II, Germany) at different output energies (2940 nm, 15 Hz, 300 pulses, 500 millisecond duration, 42 J; either 140 mJ shown on the display (input) with 61 mJ at the fiberoptic tip (output), or 140 mJ shown on the display (input) with 51 mJ at the fiberoptic tip (output)). Statistical analysis showed no statistical differences between the tested treatments (ANOVA, p>0.05). ANOVA also showed a statistically significant difference (p<0.01) between the root canal thirds, indicating that the middle third had less debris than the apical third. We conclude that: 1) none of the tested treatments led to totally cleaned root canals; 2) all treatments removed debris similarly; 3) the middle third had less debris than the apical third; 4) variation in output energy did not increase cleaning.

  15. Wet particle source identification and reduction using a new filter cleaning process

    NASA Astrophysics Data System (ADS)

    Umeda, Toru; Morita, Akihiko; Shimizu, Hideki; Tsuzuki, Shuichi

    2014-03-01

    Wet particle reduction during filter installation and start-up aligns closely with initiatives to reduce both chemical consumption and preventative maintenance time. The present study focuses on the effect of filter material cleanliness on wet particle defectivity through evaluation of filters treated with a new enhanced cleaning process focused on reducing organic compounds. Little difference in filter performance is observed between the two filter types at a size detection threshold of 60 nm, while clear differences are observed at 26 nm. This suggests that organic compounds are a potential source of wet particles. Pall recommends filters treated with the special cleaning process for applications with a critical defect size of less than 60 nm. Standard filter products are capable of satisfying wet particle defect performance criteria in less critical lithography applications.

  16. Scalable Multifunctional Ultra-thin Graphite Sponge: Free-standing, Superporous, Superhydrophobic, Oleophilic Architecture with Ferromagnetic Properties for Environmental Cleaning

    NASA Astrophysics Data System (ADS)

    Bay, Hamed Hosseini; Patino, Daisy; Mutlu, Zafer; Romero, Paige; Ozkan, Mihrimah; Ozkan, Cengiz S.

    2016-02-01

    Water decontamination and oil/water separation are principal motives in the surge to develop novel means for sustainability. In this respect, supplying clean water to ecosystems is as important as recovering spilled oil, since supplies are scarce. Inspired to design an engineering material that not only serves this purpose but can also be adapted for other applications to preserve natural resources, we suggest a facile template-free process to fabricate a superporous, superhydrophobic ultra-thin graphite sponge. Moreover, the process is designed to be inexpensive and scalable. The fabricated sponge can be used to clean up different types of oil, organic solvents, and toxic and corrosive contaminants. This versatile microstructure retains its functionality even when pulverized. The sponge is applicable for targeted sorption and collection due to its ferromagnetic properties. We hope that such a cost-effective process can be embraced and implemented widely.

  17. Building upon Historical Competencies: Next-generation Clean-up Technologies for World-Wide Application - 13368

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guevara, K.C.; Fellinger, A.P.; Aylward, R.S.

    The Department of Energy's Savannah River Site has a 60-year history of successfully operating nuclear facilities and cleaning up the nuclear legacy of the Cold War era through the processing of radioactive and otherwise hazardous wastes, remediation of contaminated soil and groundwater, management of nuclear materials, and deactivation and decommissioning of excess facilities. SRS recently unveiled its Enterprise.SRS (E.SRS) strategic vision to identify and facilitate application of the historical competencies of the site to current and future national and global challenges. E.SRS initiatives such as the initiative to Develop and Demonstrate Next-generation Clean-up Technologies seek timely and mutually beneficial engagements with entities around the country and the world. One such ongoing engagement is with government and industry in Japan in the recovery from the devastation of the Fukushima Daiichi Nuclear Power Station. (authors)

  18. A review of polymer nanofibres by electrospinning and their application in oil-water separation for cleaning up marine oil spills.

    PubMed

    Sarbatly, Rosalam; Krishnaiah, Duduku; Kamin, Zykamilia

    2016-05-15

    The growth of oil and gas exploration and production activities has increased environmental problems, such as oil spillage and the resulting pollution. The study of methods for cleaning up oil spills is a critical issue for protecting the environment. Various techniques are available to contain oil spills, but they are typically time consuming, energy inefficient, and create secondary pollution. The use of a sorbent, such as a nanofibre sorbent, is a technique for controlling oil spills because of its good physical and oil sorption properties. This review discusses the application of nanofibre sorbents for oil removal from water and its current developments. With their unique physical and mechanical properties coupled with their very high surface area and small pore sizes, nanofibre sorbents are alternative materials for cleaning up oil spills. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Characterization of occupational exposures to cleaning products used for common cleaning tasks--a pilot study of hospital cleaners.

    PubMed

    Bello, Anila; Quinn, Margaret M; Perry, Melissa J; Milton, Donald K

    2009-03-27

    In recent years, cleaning has been identified as an occupational risk because of an increased incidence of reported respiratory effects, such as asthma and asthma-like symptoms among cleaning workers. Due to the lack of systematic occupational hygiene analyses and workplace exposure data, it is not clear which cleaning-related exposures induce or aggravate asthma and other respiratory effects. Currently, there is a need for systematic evaluation of cleaning product ingredients and their exposures in the workplace. The objectives of this work were to: a) identify cleaning products' ingredients of concern with respect to respiratory and skin irritation and sensitization; and b) assess the potential for inhalation and dermal exposures to these ingredients during common cleaning tasks. We prioritized ingredients of concern in cleaning products commonly used in several hospitals in Massachusetts. Methods included workplace interviews, reviews of product Material Safety Data Sheets and the scientific literature on adverse health effects to humans, reviews of physico-chemical properties of cleaning ingredients, and occupational hygiene observational analyses. Furthermore, the potential for exposure in the workplace was assessed by conducting qualitative assessment of airborne exposures and semi-quantitative assessment of dermal exposures. Cleaning products used for common cleaning tasks were mixtures of many chemicals, including respiratory and dermal irritants and sensitizers. Examples of ingredients of concern include quaternary ammonium compounds, 2-butoxyethanol, and ethanolamines. Cleaning workers are at risk of acute and chronic inhalation exposures to volatile organic compounds (VOC) vapors and aerosols generated from product spraying, and dermal exposures mostly through hands. Cleaning products are mixtures of many chemical ingredients that may impact workers' health through air and dermal exposures. Because cleaning exposures are a function of product formulations and product application procedures, a combination of product evaluation with workplace exposure assessment is critical in developing strategies for protecting workers from cleaning hazards. Our task-based assessment methods allowed classification of tasks into different exposure categories, a strategy that can be employed by epidemiological investigations related to cleaning. The methods presented here can be used by occupational and environmental health practitioners to identify intervention strategies.

  20. Characterization of occupational exposures to cleaning products used for common cleaning tasks-a pilot study of hospital cleaners

    PubMed Central

    2009-01-01

    Background In recent years, cleaning has been identified as an occupational risk because of an increased incidence of reported respiratory effects, such as asthma and asthma-like symptoms among cleaning workers. Due to the lack of systematic occupational hygiene analyses and workplace exposure data, it is not clear which cleaning-related exposures induce or aggravate asthma and other respiratory effects. Currently, there is a need for systematic evaluation of cleaning product ingredients and their exposures in the workplace. The objectives of this work were to: a) identify cleaning products' ingredients of concern with respect to respiratory and skin irritation and sensitization; and b) assess the potential for inhalation and dermal exposures to these ingredients during common cleaning tasks. Methods We prioritized ingredients of concern in cleaning products commonly used in several hospitals in Massachusetts. Methods included workplace interviews, reviews of product Material Safety Data Sheets and the scientific literature on adverse health effects to humans, reviews of physico-chemical properties of cleaning ingredients, and occupational hygiene observational analyses. Furthermore, the potential for exposure in the workplace was assessed by conducting qualitative assessment of airborne exposures and semi-quantitative assessment of dermal exposures. Results Cleaning products used for common cleaning tasks were mixtures of many chemicals, including respiratory and dermal irritants and sensitizers. Examples of ingredients of concern include quaternary ammonium compounds, 2-butoxyethanol, and ethanolamines. Cleaning workers are at risk of acute and chronic inhalation exposures to volatile organic compounds (VOC) vapors and aerosols generated from product spraying, and dermal exposures mostly through hands. Conclusion Cleaning products are mixtures of many chemical ingredients that may impact workers' health through air and dermal exposures. Because cleaning exposures are a function of product formulations and product application procedures, a combination of product evaluation with workplace exposure assessment is critical in developing strategies for protecting workers from cleaning hazards. Our task-based assessment methods allowed classification of tasks into different exposure categories, a strategy that can be employed by epidemiological investigations related to cleaning. The methods presented here can be used by occupational and environmental health practitioners to identify intervention strategies. PMID:19327131

  1. Is Respiration-Induced Variation in the Photoplethysmogram Associated with Major Hypovolemia in Patients with Acute Traumatic Injuries?

    DTIC Science & Technology

    2010-11-01

    hypovolemia in the prehospital environment. Photoplethysmogram waveforms and basic vital signs were recorded in trauma patients during prehospital...transport. Retrospectively, we used automated algorithms to select patient records with all five basic vital signs and 45 s or longer continuous, clean PPG... basic vital signs by applying multivariate regression. In 344 patients, RIWV max-min yielded areas under the ROC curves (AUCs) not significantly better

  2. Minimum-Time Consensus-Based Approach for Power System Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Tao; Wu, Di; Sun, Yannan

    2016-02-01

    This paper presents minimum-time consensus-based distributed algorithms for power system applications such as load shedding and economic dispatch. The proposed algorithms are capable of solving these problems in a minimum number of time steps instead of asymptotically, as in most existing studies. Moreover, these algorithms are applicable to both undirected and directed communication networks. Simulation results are used to validate the proposed algorithms.
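    For contrast with the minimum-time approach, a conventional asymptotic average-consensus iteration over a communication graph looks as follows; the four-bus path topology, step size, and load values are illustrative assumptions:

```python
def average_consensus(values, neighbors, steps=100, eps=0.25):
    # Classic asymptotic average consensus: each node repeatedly nudges its
    # state toward its neighbors'. The paper's contribution is reaching the
    # exact average in a minimum number of steps instead of in the limit.
    x = list(values)
    for _ in range(steps):
        x = [xi + eps * sum(x[j] - xi for j in neighbors[i])
             for i, xi in enumerate(x)]
    return x

# Path graph of four buses sharing local load measurements (made-up values)
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
load = [4.0, 8.0, 2.0, 6.0]
x = average_consensus(load, neighbors)
print([round(v, 3) for v in x])  # all nodes approach the average 5.0
```

    The step size eps must stay below 1/(max degree) for convergence; here eps = 0.25 < 0.5.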

  3. The cascaded moving k-means and fuzzy c-means clustering algorithms for unsupervised segmentation of malaria images

    NASA Astrophysics Data System (ADS)

    Abdul-Nasir, Aimi Salihah; Mashor, Mohd Yusoff; Halim, Nurul Hazwani Abd; Mohamed, Zeehaida

    2015-05-01

    Malaria is a life-threatening parasitic infectious disease that accounts for nearly one million deaths each year. Due to the requirement of prompt and accurate diagnosis of malaria, the current study proposes an unsupervised pixel segmentation based on a clustering algorithm in order to obtain fully segmented red blood cells (RBCs) infected with malaria parasites from thin blood smear images of the P. vivax species. In order to obtain the segmented infected cells, the malaria images are first enhanced using a modified global contrast stretching technique. Then, an unsupervised segmentation technique based on a clustering algorithm is applied to the intensity component of the malaria image in order to segment the infected cells from the blood cell background. In this study, cascaded moving k-means (MKM) and fuzzy c-means (FCM) clustering algorithms have been proposed for malaria slide image segmentation. After that, a median filter is applied to smooth the image as well as to remove unwanted regions such as small background pixels. Finally, a seeded region growing area extraction algorithm is applied to remove large unwanted regions that still appear in the image because their size prevents removal by the median filter. The effectiveness of the proposed cascaded MKM and FCM clustering algorithms has been analyzed qualitatively and quantitatively by comparing the cascaded algorithm with the MKM and FCM clustering algorithms individually. Overall, the results indicate that segmentation using the proposed cascaded clustering algorithm produced the best segmentation performance, achieving acceptable sensitivity as well as high specificity and accuracy values compared to the segmentation results provided by the MKM and FCM algorithms.
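    The fuzzy c-means stage of the cascade can be sketched on 1-D intensity data as follows. The deterministic initialization, toy pixel values, and two-cluster setting are illustrative assumptions, and the paper's moving k-means stage is omitted:

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=50):
    # Plain FCM on 1-D intensities: alternate membership updates and
    # membership-weighted center updates until (approximate) convergence.
    # Deterministic init: spread centers over the intensity range.
    centers = [min(data) + (max(data) - min(data)) * k / (c - 1)
               for k in range(c)]
    for _ in range(iters):
        u = []  # membership of each pixel in each cluster
        for x in data:
            d = [abs(x - ck) + 1e-9 for ck in centers]
            u.append([1.0 / sum((d[i] / d[j]) ** (2 / (m - 1))
                                for j in range(c))
                      for i in range(c)])
        centers = [sum((u[n][i] ** m) * data[n] for n in range(len(data))) /
                   sum(u[n][i] ** m for n in range(len(data)))
                   for i in range(c)]
    return centers, u

# Toy intensity data: dark background pixels and bright "infected" pixels
pixels = [12, 10, 14, 11, 200, 205, 198, 13, 202]
centers, u = fuzzy_c_means(pixels, c=2)
labels = [max(range(2), key=lambda i: ui[i]) for ui in u]
print([round(ck) for ck in centers], labels)
```

    On a real slide image the same update runs over the intensity component of every pixel, followed by the median filtering and region-growing steps described above.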

  4. A Fast parallel tridiagonal algorithm for a class of CFD applications

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Sun, Xian-He

    1996-01-01

    The parallel diagonal dominant (PDD) algorithm is an efficient tridiagonal solver. This paper presents for study a variation of the PDD algorithm, the reduced PDD algorithm. The new algorithm maintains the minimum communication provided by the PDD algorithm, but has a reduced operation count. The PDD algorithm also has a smaller operation count than the conventional sequential algorithm for many applications. Accuracy analysis is provided for the reduced PDD algorithm for symmetric Toeplitz tridiagonal (STT) systems. Implementation results on Langley's Intel Paragon and IBM SP2 show that both the PDD and reduced PDD algorithms are efficient and scalable.
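    For reference, the sequential baseline that the PDD algorithm parallelizes is the Thomas algorithm; the sketch below solves a small symmetric Toeplitz tridiagonal (STT) system with the stencil [-1, 4, -1], chosen purely for illustration:

```python
def thomas(a, b, c, d):
    # Sequential Thomas algorithm for a tridiagonal system Ax = d, where
    # a, b, c are the sub-, main, and super-diagonals. The PDD algorithm
    # partitions such systems across processors with minimal communication.
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):  # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):  # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Symmetric Toeplitz tridiagonal system with stencil [-1, 4, -1]
a = [0, -1, -1, -1]
b = [4, 4, 4, 4]
c = [-1, -1, -1, 0]
x = thomas(a, b, c, [2, 1, 1, 2])
print([round(v, 4) for v in x])
```

    Diagonal dominance (|4| > |-1| + |-1|) is what lets the PDD variant drop the inter-block correction terms with bounded error, which is the subject of the paper's accuracy analysis.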

  5. A robust hidden Markov Gauss mixture vector quantizer for a noisy source.

    PubMed

    Pyun, Kyungsuk Peter; Lim, Johan; Gray, Robert M

    2009-07-01

    Noise is ubiquitous in real life and changes image acquisition, communication, and processing characteristics in an uncontrolled manner. Gaussian noise and Salt and Pepper noise, in particular, are prevalent in noisy communication channels, camera and scanner sensors, and medical MRI images. It is not unusual for highly sophisticated image processing algorithms developed for clean images to malfunction when used on noisy images. For example, hidden Markov Gauss mixture models (HMGMM) have been shown to perform well in image segmentation applications, but they are quite sensitive to image noise. We propose a modified HMGMM procedure specifically designed to improve performance in the presence of noise. The key feature of the proposed procedure is the adjustment of covariance matrices in Gauss mixture vector quantizer codebooks to minimize an overall minimum discrimination information distortion (MDI). In adjusting covariance matrices, we expand or shrink their elements based on the noisy image. While most results reported in the literature assume a particular noise type, we propose a framework without assuming particular noise characteristics. Without denoising the corrupted source, we apply our method directly to the segmentation of noisy sources. We apply the proposed procedure to the segmentation of aerial images with Salt and Pepper noise and with independent Gaussian noise, and we compare our results with those of the median filter restoration method and the blind deconvolution-based method, respectively. We show that our procedure has better performance than image restoration-based techniques and closely matches the performance of HMGMM for clean images in terms of both visual segmentation results and error rate.

  6. National Renewable Energy Laboratory (NREL) Topic 2 Final Report: End-to-End Communication and Control System to Support Clean Energy Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudgins, Andrew P.; Carrillo, Ismael M.; Jin, Xin

    This document is the final report of a two-year development, test, and demonstration project, 'Cohesive Application of Standards-Based Connected Devices to Enable Clean Energy Technologies.' The project was part of the National Renewable Energy Laboratory's (NREL's) Integrated Network Testbed for Energy Grid Research and Technology (INTEGRATE) initiative hosted at the Energy Systems Integration Facility (ESIF). This project demonstrated techniques to control distribution grid events using the coordination of traditional distribution grid devices and high-penetration renewable resources and demand response. Using standard communication protocols and semantic standards, the project examined the use cases of high/low distribution voltage, requests for volt-ampere-reactive (VAR) power support, and transactive energy strategies using Volttron. Open source software, written by EPRI to control distributed energy resources (DER) and demand response (DR), was used by an advanced distribution management system (ADMS) to abstract the resources reporting to a collection of capabilities rather than needing to know specific resource types. This architecture allows for scaling both horizontally and vertically. Several new technologies were developed and tested. Messages from the ADMS based on the common information model (CIM) were developed to control the DER and DR management systems. The OpenADR standard was used to help manage grid events by turning loads off and on. Volttron technology was used to simulate a homeowner choosing the price at which to enter the demand response market. Finally, the ADMS used newly developed algorithms to coordinate these resources with a capacitor bank and voltage regulator to respond to grid events.

  7. Clean birth and postnatal care practices to reduce neonatal deaths from sepsis and tetanus: a systematic review and Delphi estimation of mortality effect

    PubMed Central

    2011-01-01

    Background Annually over 520,000 newborns die from neonatal sepsis, and 60,000 more from tetanus. Estimates of the effect of clean birth and postnatal care practices are required for evidence-based program planning. Objective To review the evidence for clean birth and postnatal care practices and estimate the effect on neonatal mortality from sepsis and tetanus for the Lives Saved Tool (LiST). Methods We conducted a systematic review of multiple databases. Data were abstracted into standard tables and assessed by GRADE criteria. Where appropriate, meta-analyses were undertaken. For interventions with low quality evidence but a strong GRADE recommendation, a Delphi process was conducted. Results Low quality evidence supports a reduction in all-cause neonatal mortality (19% (95% c.i. 1–34%)), cord infection (30% (95% c.i. 20–39%)) and neonatal tetanus (49% (95% c.i. 35–62%)) with birth attendant handwashing. Very low quality evidence supports a reduction in neonatal tetanus mortality with a clean birth surface (93% (95% c.i. 77–100%)) and no relationship between a clean perineum and tetanus. Low quality evidence supports a reduction of neonatal tetanus with facility birth (68% (95% c.i. 47–88%)). No relationship was found between birth place and cord infections or sepsis mortality. For postnatal clean practices, all-cause mortality is reduced with chlorhexidine cord applications in the first 24 hours of life (34% (95% c.i. 5–54%), moderate quality evidence) and antimicrobial cord applications (63% (95% c.i. 41–86%), low quality evidence). One study of postnatal maternal handwashing reported reductions in all-cause mortality (44% (95% c.i. 18–62%)) and cord infection (24% (95% c.i. 5–40%)). Given the low quality of evidence, a Delphi expert opinion process was undertaken.
Thirty experts reached consensus regarding reduction of neonatal sepsis deaths by clean birth practices at home (15% (IQR 10–20)) or in a facility (27% (IQR 24–36)), and by clean postnatal care practices (40% (IQR 25–50)). The panel estimated that neonatal tetanus mortality was reduced by clean birth practices at home (30% (IQR 20–30)) or in a facility (38% (IQR 34–40)), and by clean postnatal care practices (40% (IQR 30–50)). Conclusion According to expert opinion, clean birth and particularly postnatal care practices are effective in reducing neonatal mortality from sepsis and tetanus. Further research is required regarding optimal implementation strategies. PMID:21501428

  8. Bioinspired Multifunctional Paper-Based rGO Composites for Solar-Driven Clean Water Generation.

    PubMed

    Lou, Jinwei; Liu, Yang; Wang, Zhongyong; Zhao, Dengwu; Song, Chengyi; Wu, Jianbo; Dasgupta, Neil; Zhang, Wang; Zhang, Di; Tao, Peng; Shang, Wen; Deng, Tao

    2016-06-15

    Reusing polluted water through various decontamination techniques has emerged as one of the most practical approaches to address the global shortage of clean water. Rather than relying on a single decontamination mechanism, herein we report the preparation and utilization of paper-based composites for multifunctional solar-driven clean water generation, inspired by the multiple water purification approaches in biological systems. The reduced graphene oxide (rGO) sheets within such composites can efficiently remove organic contaminants through a physical adsorption mechanism. Under solar irradiation, the floating rGO composites can instantly generate localized heating, which not only directly generates clean water through distillation but also significantly enhances adsorption removal performance with the assistance of upward vapor flow. Such porous-structured paper-based composites allow for facile incorporation of photocatalysts to regenerate clean water out of contaminated water with combined adsorption, photodegradation, and interfacial heat-assisted distillation mechanisms. Within a homemade all-in-one water treatment device, the practical applicability of the composites for multifunctional clean water generation has been demonstrated.

  9. 40 CFR 52.2270 - Identification of plan.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... incorporated as it exists on the date of the approval, and notice of any change in the material will be... Clean School Bus Program Section 114.640 Definitions 9/20/2006 4/9/2010, 75 FR 18061 Section 114.642 Applicability 9/20/2006 4/9/2010, 75 FR 18061 Section 114.644 Clean School Bus Program Requirements 9/20/2006 4...

  10. 76 FR 34872 - Approval and Promulgation of Implementation Plans; State of California; Regional Haze and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-15

    ... that they have either received TAS or completed the application process for TAS under the Clean Water.... Similarly, we also do not need to address the Tribes' comment regarding TAS under the Clean Water Act as...-polluting fuels or use advanced control technology to reduce emissions of NO X (CAA section 182(e)(3)). \\14...

  11. Developing an Approach to Harvesting, Cleaning, and Analyzing Data from Twitter Using R

    ERIC Educational Resources Information Center

    Hill, Stephen; Scott, Rebecca

    2017-01-01

    Using data from social media can be of great value to businesses and other interested parties. However, harvesting data from social media networks such as Twitter, cleaning the data, and analyzing the data can be difficult. In this article, a step-by-step approach to obtaining data via the Twitter application program interface (API) is described.…

  12. Metal sponge for cryosorption pumping applications

    DOEpatents

    Myneni, G.R.; Kneisel, P.

    1995-12-26

    A system has been developed for adsorbing gases at high vacuum in a closed area. The system utilizes large surface clean anodized metal surfaces at low temperatures to adsorb the gases. The large surface clean anodized metal is referred to as a metal sponge. The metal sponge generates or maintains the high vacuum by increasing the available active cryosorbing surface area. 4 figs.

  13. Clean Transfer of Large Graphene Single Crystals for High-Intactness Suspended Membranes and Liquid Cells.

    PubMed

    Zhang, Jincan; Lin, Li; Sun, Luzhao; Huang, Yucheng; Koh, Ai Leen; Dang, Wenhui; Yin, Jianbo; Wang, Mingzhan; Tan, Congwei; Li, Tianran; Tan, Zhenjun; Liu, Zhongfan; Peng, Hailin

    2017-07-01

    The atomically thin 2D nature of suspended graphene membranes holds promise for numerous technological applications. In particular, their outstanding transparency to electron beams endows graphene membranes with great potential as a candidate for specimen support in transmission electron microscopy (TEM). However, major hurdles remain to be addressed to acquire an ultraclean, high-intactness, and defect-free suspended graphene membrane. Here, a polymer-free clean transfer of sub-centimeter-sized graphene single crystals onto TEM grids to fabricate large-area and high-quality suspended graphene membranes has been achieved. Through control of the interfacial force during the transfer, the intactness of large-area graphene membranes can be as high as 95%, prominently larger than values reported in previous works. Graphene liquid cells are readily prepared by π-π stacking two clean single-crystal graphene TEM grids, in which atomic-scale resolution imaging and the temporal evolution of colloidal Au nanoparticles are recorded. This facile and scalable production of clean and high-quality suspended graphene membranes is promising for wide application in electron and optical microscopy. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Surface cleaning techniques and efficient B-field profiles for lithium ion sources on extraction ion diodes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuneo, M.E.; Menge, P.R.; Hanson, D.L.

    Application of ion beams to Inertial Confinement Fusion requires efficient production, transport and focusing of an intense, low microdivergence beam of an appropriate range ion. At Sandia, the authors are studying the production of lithium ion beams in extraction applied-B ion diodes on the SABRE accelerator (5 MV, 250 kA). Evidence on both SABRE (1 TW) and PBFA-II (20 TW) indicates that the lithium beam turns off and is replaced by a beam of mostly protons and carbon, possibly due to electron thermal and stimulated desorption of hydrocarbon surface contamination with subsequent avalanche ionization. Turn-off of the lithium beam is accompanied by rapid impedance collapse. Surface cleaning techniques are being developed to reduce beam contamination, increase the total lithium energy and reduce the rate of diode impedance collapse. Application of surface cleaning techniques has increased the production of lithium from passive LiF sources by a factor of 2. Improved diode electric and magnetic field profiles have increased the diode efficiency and production of lithium by a factor of 5, without surface cleaning. Work is ongoing to combine these two advances which are discussed here.

  15. Fate and aqueous transport of mercury in light of the Clean Air Mercury Rule for coal-fired electric power plants

    NASA Astrophysics Data System (ADS)

    Arzuman, Anry

    Mercury is a hazardous air pollutant emitted to the atmosphere in large amounts. Mercury emissions from electric power generation sources were estimated to be 48 metric tons/year, constituting the single largest anthropogenic source of mercury in the U.S. Settled mercury species are highly toxic contaminants of the environment. The newly issued Federal Clean Air Mercury Rule requires that coal-fired electric power plants meet the new Maximum Achievable Mercury Control Technology limit by 2018. This signifies that all of the air-phase mercury will be concentrated in the solid phase, which, based on the current state of Air Pollution Control Technology, will be fly ash. Fly ash is utilized by different industries, including the construction industry, in concrete and its products, road bases, structural fills, monofills, and for solidification, stabilization, etc. Since the increase in coal combustion in the U.S. (1.6 percent/year) is much higher than the fly ash demand, large amounts of fly ash containing mercury and other trace elements are expected to accumulate in the coming decades. The amount of mercury transferred from one phase to another is not a linear function of coal combustion or ash production, depends on the future states of technology, and is unknown. The amount of aqueous mercury as a function of the future removal, mercury speciation, and coal and aquifer characteristics is also unknown. This paper makes a first attempt to relate mercury concentrations in coal, flue gas, fly ash, and fly ash leachate using a single algorithm. Mercury concentrations in all phases were examined and phase transformation algorithms were derived in a form suitable for probabilistic analyses. 
Important parameters used in the transformation algorithms were derived, including the Soil Cation Exchange Capacity for mercury, the soil mercury selectivity sequence, the mercury activity coefficient, the mercury retardation factor, the mercury species soil adsorption ratio, and the mercury Freundlich soil adsorption isotherm coefficients. Mercury air-phase removal efficiency was studied as a function of dominant mercury species vapor pressures, the amount of chlorine, sorbent injection rate and adsorption capacities, and process temperature and modifications. A mercury air-phase removal algorithm was derived which defines the future removal efficiencies as a function of activated carbon injection rate. Mercury adsorption on soil was studied as a function of the Mercury Mass Law, incorporating the dominant aquatic mercury species, pH, chlorine and sulfur concentrations, and the amount of complexed hydroxyl groups. Aquatic mercury longitudinal plume delineation was studied using the Domenico and Robbins function. A Monte Carlo simulation was performed using random number series (5000) for all of the variables in the Domenico and Robbins and mercury retardation functions. The probability that the Maximum Contaminant Level for mercury will be exceeded was found to be approximately 1 percent across all soil-related fly ash applications.
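The Monte Carlo exceedance step described above can be sketched generically. The transport function and parameter distributions below are hypothetical stand-ins for illustration only, not the Domenico and Robbins solution or the paper's derived distributions:

```python
import random

random.seed(42)

MCL = 0.002  # mg/L, the U.S. Maximum Contaminant Level for mercury

def leachate_concentration(c0, retardation, attenuation):
    # Hypothetical stand-in for the aqueous transport model: a source
    # concentration reduced by retardation and a lumped attenuation factor.
    return c0 * attenuation / retardation

n = 5000  # a random number series of 5000, as in the abstract
exceedances = 0
for _ in range(n):
    c0 = random.lognormvariate(-5.0, 1.0)       # source term (mg/L), hypothetical
    retardation = random.uniform(10.0, 1000.0)  # mercury retardation factor
    attenuation = random.uniform(0.01, 1.0)     # lumped plume dilution/decay
    if leachate_concentration(c0, retardation, attenuation) > MCL:
        exceedances += 1

probability = exceedances / n
print(f"estimated P(C > MCL): {probability:.4f}")
```

Substituting the actual Domenico and Robbins function for `leachate_concentration`, and the paper's fitted distributions for the uniform and lognormal draws, would recover the study's procedure.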

  16. Screening, down selection, and implementation of environmentally compliant cleaning and insulation bonding for MNASA

    NASA Astrophysics Data System (ADS)

    Keen, Jill M.; Hutchens, D. E.; Smith, G. M.; Dillard, T. W.

    1994-06-01

    MNASA, a quarter-scale space shuttle solid rocket motor, has historically been processed using environmentally and physiologically harmful chemicals. This program draws from previous testing done in support of full-scale manufacturing and examines the synergy and interdependency between environmentally acceptable materials for Solid Rocket Motor insulation applications, bonding, corrosion inhibiting, painting, priming and cleaning; and then implements new materials and processes in sub-scale motors. Tests have been conducted to eliminate or minimize hazardous chemicals used in the manufacture of MNASA components and identify alternate materials and/or processes following NASA Operational Environment Team (NOET) priorities. This presentation describes implementation of high pressure water refurbishment cleaning, aqueous precision cleaning using both Brulin 815 GD and Jettacin and insulation case bonding using ODC compliant primers and adhesives.

  17. Underwater cleaning techniques used for removal of zebra mussels at the FitzPatrick Nuclear Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hobbs, B.; Kahabka, J.

    1995-06-01

    This paper discusses the use of a mechanical brush cleaning technology recently used to remove biofouling from the Circulating Water (CW) System at the New York Power Authority's James A. FitzPatrick Nuclear Power Plant. The FitzPatrick plant had previously used chemical molluscicide to treat zebra mussels in the CW system. Full system treatment was performed in 1992 with limited forebay/screenwell treatment in 1993. The New York Power Authority (NYPA) decided to conduct a mechanical cleaning of the intake system in 1994. Specific project objectives included: (1) achieve a level of surface cleanliness greater than 98%; (2) remove 100% of debris, both existing sediment and debris generated as a result of cleaning; (3) inspect all surfaces and components, identifying any problem areas; (4) complete the task within the 1994-95 refueling outage schedule window; and (5) determine whether underwater mechanical cleaning is a cost-effective zebra mussel control method suitable for future application at FitzPatrick. A pre-cleaning inspection, including underwater video photography, was conducted of each area. Cleaning was accomplished using diver-controlled, multi-brush equipment, including the electro-hydraulically powered Submersible Cleaning and Maintenance Platform (SCAMP) and several designs of hand-held machines. The brushes swept all zebra mussels off surfaces, restoring concrete and metal substrates to their original condition. Sensitive areas, including pump housings, standpipes, sensor piping and chlorine injection tubing, were cleaned without degradation. Submersible vortex vacuum pumps were used to remove debris from the cavity. More than 46,000 ft² of surface area was cleaned and over 460 cubic yards of dewatered debris were removed. As each area was completed, a post-clean inspection with photos and video was performed.

  18. Potential Nano-Enabled Environmental Applications for Radionuclides

    EPA Pesticide Factsheets

    This document provides information about nanotechnology materials and processes that may be applicable when cleaning up radioactively contaminated sites or materials, and presents a snapshot of lessons learned in nano-science and engineering.

  19. Staying sticky: contact self-cleaning of gecko-inspired adhesives.

    PubMed

    Mengüç, Yigit; Röhrig, Michael; Abusomwan, Uyiosa; Hölscher, Hendrik; Sitti, Metin

    2014-05-06

    The exceptionally adhesive foot of the gecko remains clean in dirty environments by shedding contaminants with each step. Synthetic gecko-inspired adhesives have achieved similar attachment strengths to the gecko on smooth surfaces, but the process of contact self-cleaning has yet to be effectively demonstrated. Here, we present the first gecko-inspired adhesive that has matched both the attachment strength and the contact self-cleaning performance of the gecko's foot on a smooth surface. Contact self-cleaning experiments were performed with three different sizes of mushroom-shaped elastomer microfibres and five different sizes of spherical silica contaminants. Using a load-drag-unload dry contact cleaning process similar to the loads acting on the gecko foot during locomotion, our fully contaminated synthetic gecko adhesives could recover lost adhesion at a rate comparable to that of the gecko. We observed that the relative size of contaminants to the characteristic size of the microfibres in the synthetic adhesive strongly determined how and to what degree the adhesive recovered from contamination. Our approximate model and experimental results show that the dominant mechanism of contact self-cleaning is particle rolling during the drag process. Embedding of particles between adjacent fibres was observed for particles with diameter smaller than the fibre tips, and further studied as a temporary cleaning mechanism. By incorporating contact self-cleaning capabilities, real-world applications of synthetic gecko adhesives, such as reusable tapes, clothing closures and medical adhesives, would become feasible.

  20. Staying sticky: contact self-cleaning of gecko-inspired adhesives

    PubMed Central

    Mengüç, Yiğit; Röhrig, Michael; Abusomwan, Uyiosa; Hölscher, Hendrik; Sitti, Metin

    2014-01-01

    The exceptionally adhesive foot of the gecko remains clean in dirty environments by shedding contaminants with each step. Synthetic gecko-inspired adhesives have achieved similar attachment strengths to the gecko on smooth surfaces, but the process of contact self-cleaning has yet to be effectively demonstrated. Here, we present the first gecko-inspired adhesive that has matched both the attachment strength and the contact self-cleaning performance of the gecko's foot on a smooth surface. Contact self-cleaning experiments were performed with three different sizes of mushroom-shaped elastomer microfibres and five different sizes of spherical silica contaminants. Using a load–drag–unload dry contact cleaning process similar to the loads acting on the gecko foot during locomotion, our fully contaminated synthetic gecko adhesives could recover lost adhesion at a rate comparable to that of the gecko. We observed that the relative size of contaminants to the characteristic size of the microfibres in the synthetic adhesive strongly determined how and to what degree the adhesive recovered from contamination. Our approximate model and experimental results show that the dominant mechanism of contact self-cleaning is particle rolling during the drag process. Embedding of particles between adjacent fibres was observed for particles with diameter smaller than the fibre tips, and further studied as a temporary cleaning mechanism. By incorporating contact self-cleaning capabilities, real-world applications of synthetic gecko adhesives, such as reusable tapes, clothing closures and medical adhesives, would become feasible. PMID:24554579

  1. A new BP Fourier algorithm and its application in English teaching evaluation

    NASA Astrophysics Data System (ADS)

    Pei, Xuehui; Pei, Guixin

    2017-08-01

    The BP neural network algorithm offers wide adaptability and accuracy in complicated system evaluation, but computational defects such as slow convergence have limited its practical application. This paper seeks to speed up the convergence of the BP neural network algorithm using Fourier basis functions and presents a new BP Fourier algorithm for complicated system evaluation. First, the shortcomings and working principle of the BP algorithm are analyzed to guide targeted improvement; second, the BP Fourier algorithm adopts Fourier basis functions to simplify the calculation structure, designs a new transfer function between the input and output layers, and provides a theoretical analysis of its efficiency; finally, the algorithm is applied to evaluating university English teaching, and the results show that the BP Fourier algorithm performs better in computational efficiency and evaluation accuracy and can be used practically for complicated system evaluation.
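The abstract does not give the exact transfer-function design, but the core idea, using Fourier basis functions to simplify the structure that gradient descent must optimize, can be sketched minimally. The single linear output layer and the toy target below are assumptions standing in for full BP training:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training data: a target that is exactly representable in the Fourier basis.
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.5 * np.cos(4 * np.pi * x)

# Fourier basis features sin(2*pi*k*x), cos(2*pi*k*x) for k = 1..K, plus a bias.
K = 4
features = [np.ones_like(x)]
for k in range(1, K + 1):
    features.append(np.sin(2 * np.pi * k * x))
    features.append(np.cos(2 * np.pi * k * x))
Phi = np.stack(features, axis=1)  # shape (200, 2K+1)

# A single linear output layer trained by gradient descent on the MSE:
# the error-backpropagation update reduced to its simplest case.
w = rng.normal(scale=0.1, size=Phi.shape[1])
lr = 0.1
for _ in range(2000):
    residual = Phi @ w - y
    w -= lr * (2.0 / len(x)) * (Phi.T @ residual)

mse = np.mean((Phi @ w - y) ** 2)
print(f"final MSE: {mse:.2e}")
```

Because the Fourier features are nearly orthogonal on the grid, the loss surface is well conditioned and plain gradient descent converges quickly, which is the kind of structural simplification the paper attributes to the Fourier basis.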

  2. Development of Commercial Thermo-sensitive Genic Male Sterile Rice Accelerates Hybrid Rice Breeding Using the CRISPR/Cas9-mediated TMS5 Editing System.

    PubMed

    Zhou, Hai; He, Ming; Li, Jing; Chen, Liang; Huang, Zhifeng; Zheng, Shaoyan; Zhu, Liya; Ni, Erdong; Jiang, Dagang; Zhao, Bingran; Zhuang, Chuxiong

    2016-11-22

    Hybrid rice breeding offers an important strategy to improve rice production, in which the cultivation of a male sterile line is the key to the success of cross-breeding. CRISPR/Cas9 systems have been widely used in target-site genome editing, whereas their application for crop genetic improvement has been rarely reported. Here, using the CRISPR/Cas9 system, we induced specific mutations in TMS5, which is the most widely applied thermo-sensitive genic male sterility (TGMS) gene in China, and developed new "transgene clean" TGMS lines. We designed 10 target sites in the coding region of TMS5 for targeted mutagenesis using the CRISPR/Cas9 system and assessed the potential rates of on- and off-target effects. Finally, we established the most efficient construct, the TMS5ab construct, for breeding potentially applicable "transgene clean" TGMS lines. We also discussed factors that affect the editing efficiency according to the characteristics of different target sequences. Notably, using the TMS5ab construct, we developed 11 new "transgene clean" TGMS lines with potential applications in hybrid breeding within only one year in both rice subspecies. The application of our system not only significantly accelerates the breeding of sterile lines but also facilitates the exploitation of heterosis.

  3. Layout Study and Application of Mobile App Recommendation Approach Based On Spark Streaming Framework

    NASA Astrophysics Data System (ADS)

    Wang, H. T.; Chen, T. T.; Yan, C.; Pan, H.

    2018-05-01

    For the mobile App recommendation domain, an item-based collaborative filtering approach combined with a weighted Slope One algorithm is adopted, further improving on the traditional collaborative filtering algorithm's problems of cold start and data-matrix sparseness. The recommendation algorithm is parallelized on the Spark platform, and the Spark Streaming real-time computing framework is introduced to improve the real-time performance of the App recommendations.
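The weighted Slope One algorithm named in this record is well documented; a minimal single-machine sketch (the toy ratings data is invented for illustration, and the Spark parallelization is omitted):

```python
from collections import defaultdict

def train_slope_one(ratings):
    """ratings: {user: {item: rating}} -> per-item-pair average deviations."""
    diffs = defaultdict(lambda: defaultdict(float))
    freqs = defaultdict(lambda: defaultdict(int))
    for user_ratings in ratings.values():
        for i, r_i in user_ratings.items():
            for j, r_j in user_ratings.items():
                if i == j:
                    continue
                diffs[i][j] += r_i - r_j
                freqs[i][j] += 1
    for i in diffs:
        for j in diffs[i]:
            diffs[i][j] /= freqs[i][j]  # average deviation of item i over item j
    return diffs, freqs

def predict(ratings, diffs, freqs, user, target):
    """Weighted Slope One: weight each deviation by its co-rating count."""
    num, den = 0.0, 0
    for j, r_j in ratings[user].items():
        if j == target or j not in freqs[target]:
            continue
        n = freqs[target][j]
        num += (diffs[target][j] + r_j) * n
        den += n
    return num / den if den else None

# Hypothetical ratings of three apps by three users.
ratings = {
    "alice": {"app_a": 5, "app_b": 3, "app_c": 2},
    "bob":   {"app_a": 3, "app_b": 4},
    "carol": {"app_b": 2, "app_c": 5},
}
diffs, freqs = train_slope_one(ratings)
print(predict(ratings, diffs, freqs, "bob", "app_c"))  # ≈ 3.33
```

In a Spark setting, the pair-deviation aggregation in `train_slope_one` is the step that parallelizes naturally as a map over users followed by a reduce over item pairs.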

  4. Determination of the clean-up efficiency of the solid-phase extraction of rosemary extracts: Application of full-factorial design in hyphenation with Gaussian peak fit function.

    PubMed

    Meischl, Florian; Kirchler, Christian Günter; Jäger, Michael Andreas; Huck, Christian Wolfgang; Rainer, Matthias

    2018-02-01

    We present a novel method for the quantitative determination of the clean-up efficiency to provide a calculated parameter for peak purity through iterative fitting in conjunction with design of experiments. Rosemary extracts were used and analyzed before and after solid-phase extraction using a self-fabricated mixed-mode sorbent based on poly(N-vinylimidazole/ethylene glycol dimethacrylate). Optimization was performed by variation of washing steps using a full three-level factorial design and response surface methodology. Separation efficiency of rosmarinic acid from interfering compounds was calculated using an iterative fit of Gaussian-like signals, and quantifications were performed by the separate integration of the two interfering peak areas. Results and recoveries were analyzed using Design-Expert® software and revealed significant differences between the washing steps. Optimized parameters were considered and used for all further experiments. Furthermore, the solid-phase extraction procedure was tested and compared with commercially available sorbents. In contrast to the generic protocols of the manufacturers, the optimized procedure showed excellent recoveries and clean-up rates for the polymer with ion-exchange properties. Finally, rosemary extracts from different manufacturing areas and application types were studied to verify the developed method for its applicability. The cleaned-up extracts were analyzed by liquid chromatography with tandem mass spectrometry for detailed compound evaluation to exclude any interference from coeluting molecules. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
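The iterative fit of two overlapping Gaussian-like signals with separate integration of the peak areas can be sketched with a standard nonlinear least-squares routine; the synthetic chromatogram below is illustrative, not the paper's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(x, a1, mu1, s1, a2, mu2, s2):
    g1 = a1 * np.exp(-((x - mu1) ** 2) / (2 * s1 ** 2))
    g2 = a2 * np.exp(-((x - mu2) ** 2) / (2 * s2 ** 2))
    return g1 + g2

# Synthetic chromatogram: an analyte peak overlapping an interferent peak.
x = np.linspace(0, 10, 500)
y = two_gaussians(x, 4.0, 4.0, 0.5, 2.0, 5.2, 0.6)

# Iterative (Levenberg-Marquardt) fit from rough initial guesses.
popt, _ = curve_fit(two_gaussians, x, y, p0=(3, 3.8, 0.4, 1.5, 5.5, 0.5))

# Integrate the two fitted components separately:
# the area of a Gaussian is amplitude * sigma * sqrt(2*pi).
area1 = popt[0] * abs(popt[2]) * np.sqrt(2 * np.pi)
area2 = popt[3] * abs(popt[5]) * np.sqrt(2 * np.pi)
print(f"analyte area: {area1:.3f}, interferent area: {area2:.3f}")
```

The ratio of the resolved analyte area to the total fitted area then serves as the calculated peak-purity parameter the abstract describes.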

  5. Electricity, water, and natural gas consumption of a residential house in Canada from 2012 to 2014.

    PubMed

    Makonin, Stephen; Ellert, Bradley; Bajić, Ivan V; Popowich, Fred

    2016-06-07

    With the cost of consuming resources increasing (both economically and ecologically), homeowners need to find ways to curb consumption. The Almanac of Minutely Power dataset Version 2 (AMPds2) has been released to help computational sustainability researchers, power and energy engineers, building scientists and technologists, utility companies, and eco-feedback researchers test their models, systems, algorithms, or prototypes on real house data. In the vast majority of cases, real-world datasets lead to more accurate models and algorithms. AMPds2 is the first dataset to capture all three main types of consumption (electricity, water, and natural gas) over a long period of time (2 years) and provide 11 measurement characteristics for electricity. No other such datasets from Canada exist. Each meter has 730 days of captured data. We also include environmental and utility billing data for cost analysis. AMPds2 data has been pre-cleaned to provide for consistent and comparable accuracy results amongst different researchers and machine learning algorithms.

  6. Vector network analyzer ferromagnetic resonance spectrometer with field differential detection

    NASA Astrophysics Data System (ADS)

    Tamaru, S.; Tsunegi, S.; Kubota, H.; Yuasa, S.

    2018-05-01

    This work presents a vector network analyzer ferromagnetic resonance (VNA-FMR) spectrometer with field differential detection. This technique differentiates the S-parameter by applying a small binary modulation field in addition to the DC bias field to the sample. By setting the modulation frequency sufficiently high, slow sensitivity fluctuations of the VNA, i.e., low-frequency components of the trace noise, which limit the signal-to-noise ratio of the conventional VNA-FMR spectrometer, can be effectively removed, resulting in a very clean FMR signal. This paper presents the details of the hardware implementation and measurement sequence as well as the data processing and analysis algorithms tailored for the FMR spectrum obtained with this technique. Because the VNA measures a complex S-parameter, it is possible to estimate the Gilbert damping parameter from the slope of the phase variation of the S-parameter with respect to the bias field. We show that this algorithm is more robust against noise than the conventional algorithm based on the linewidth.

  7. Evolutionary design of a generalized polynomial neural network for modelling sediment transport in clean pipes

    NASA Astrophysics Data System (ADS)

    Ebtehaj, Isa; Bonakdari, Hossein; Khoshbin, Fatemeh

    2016-10-01

    To determine the minimum velocity required to prevent sedimentation, six different models were proposed to estimate the densimetric Froude number (Fr). The dimensionless parameters of the models were applied along with a combination of the group method of data handling (GMDH) and the multi-target genetic algorithm. Therefore, an evolutionary design of the generalized GMDH was developed using a genetic algorithm with a specific coding scheme so as not to restrict connectivity configurations to abutting layers only. In addition, a new preserving mechanism by the multi-target genetic algorithm was utilized for the Pareto optimization of GMDH. The results indicated that the most accurate model was the one that used the volumetric concentration of sediment (CV), relative hydraulic radius (d/R), dimensionless particle number (Dgr) and overall sediment friction factor (λs) in estimating Fr. Furthermore, the comparison between the proposed method and traditional equations indicated that GMDH is more accurate than existing equations.
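The densimetric Froude number being estimated is conventionally defined as Fr = V / sqrt(g d (s - 1)); a small helper under that assumption, since the abstract does not spell out the definition:

```python
import math

def densimetric_froude(velocity, particle_diameter, specific_gravity, g=9.81):
    """Fr = V / sqrt(g * d * (s - 1)), the conventional definition, where
    V = flow velocity (m/s), d = median sediment particle diameter (m),
    and s = sediment specific gravity (dimensionless)."""
    return velocity / math.sqrt(g * particle_diameter * (specific_gravity - 1.0))

# Example: 0.6 m/s flow over 1 mm sand (s = 2.65) gives Fr ≈ 4.7.
print(densimetric_froude(0.6, 0.001, 2.65))
```

The GMDH models in the paper estimate this quantity from dimensionless inputs (CV, d/R, Dgr, λs) rather than computing it directly.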

  8. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
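Of the algorithms listed, the Maximum Likelihood method is commonly realized as the Richardson-Lucy iteration; a 1-D numpy sketch of that iteration (not the IDL implementation itself):

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=50):
    """Maximum-likelihood (Richardson-Lucy) deconvolution in 1-D.
    Multiplicative updates keep the estimate non-negative."""
    estimate = np.full_like(observed, observed.mean())
    psf_flipped = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)  # guard against /0
        estimate *= np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Synthetic test: two point sources blurred by a normalized Gaussian PSF.
truth = np.zeros(64)
truth[20] = 1.0
truth[40] = 0.5
psf = np.exp(-(np.arange(-8, 9) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()
observed = np.convolve(truth, psf, mode="same")

restored = richardson_lucy(observed, psf, iterations=200)
print("restored peak at index", restored.argmax())
```

The residual statistics DeConv_Tool monitors (standard deviation, log-likelihood, chi-square of the residual autocorrelation) would here be computed from `observed - np.convolve(restored, psf, mode="same")` at each iteration.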

  9. Different types of maximum power point tracking techniques for renewable energy systems: A survey

    NASA Astrophysics Data System (ADS)

    Khan, Mohammad Junaid; Shukla, Praveen; Mustafa, Rashid; Chatterji, S.; Mathew, Lini

    2016-03-01

    Global demand for electricity is increasing while energy production from fossil fuels is declining, so the obvious choice of a clean, abundant energy source that could provide security for future development is energy from the sun. The power-voltage characteristic of a photovoltaic generator is nonlinear and, under non-uniform irradiance, exhibits multiple peaks: many local peaks and one global peak. To track the global peak, MPPT is an essential component of photovoltaic systems. Many review articles have discussed conventional techniques such as P&O, incremental conductance, and ripple correlation control, but few attempts have been made to survey intelligent MPPT techniques. This paper discusses algorithms based on fuzzy logic, Ant Colony Optimization, the Genetic Algorithm, artificial neural networks, Particle Swarm Optimization, the Firefly algorithm, extremum seeking control, and hybrid methods applied to maximum power point tracking in photovoltaic systems under changing irradiance conditions.
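The conventional P&O technique mentioned above is simple enough to sketch. The PV power curve here is a hypothetical single-peak model, so this illustrates only the basic hill-climbing step, not global-peak tracking under partial shading:

```python
def perturb_and_observe(power_at, v_start, step=0.1, iterations=100):
    """Perturb & Observe MPPT: nudge the operating voltage and keep
    moving in whichever direction increased the extracted power."""
    v = v_start
    p_prev = power_at(v)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p = power_at(v)
        if p < p_prev:              # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

# Hypothetical PV power curve with a single maximum at 17 V.
def pv_power(v):
    return max(0.0, -0.5 * (v - 17.0) ** 2 + 60.0)

v_mpp = perturb_and_observe(pv_power, v_start=10.0)
print(f"operating point: {v_mpp:.1f} V")  # settles into oscillation near 17 V
```

The steady-state oscillation around the peak (here ±one `step`) is the classic P&O drawback that the intelligent techniques surveyed in the paper aim to reduce, alongside their ability to escape local peaks.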

  10. Electricity, water, and natural gas consumption of a residential house in Canada from 2012 to 2014

    PubMed Central

    Makonin, Stephen; Ellert, Bradley; Bajić, Ivan V.; Popowich, Fred

    2016-01-01

    With the cost of consuming resources increasing (both economically and ecologically), homeowners need to find ways to curb consumption. The Almanac of Minutely Power dataset Version 2 (AMPds2) has been released to help computational sustainability researchers, power and energy engineers, building scientists and technologists, utility companies, and eco-feedback researchers test their models, systems, algorithms, or prototypes on real house data. In the vast majority of cases, real-world datasets lead to more accurate models and algorithms. AMPds2 is the first dataset to capture all three main types of consumption (electricity, water, and natural gas) over a long period of time (2 years) and provide 11 measurement characteristics for electricity. No other such datasets from Canada exist. Each meter has 730 days of captured data. We also include environmental and utility billing data for cost analysis. AMPds2 data has been pre-cleaned to provide for consistent and comparable accuracy results amongst different researchers and machine learning algorithms. PMID:27271937

  11. Prediction of breast cancer risk with volatile biomarkers in breath.

    PubMed

    Phillips, Michael; Cataneo, Renee N; Cruz-Ramos, Jose Alfonso; Huston, Jan; Ornelas, Omar; Pappas, Nadine; Pathak, Sonali

    2018-03-23

    Human breath contains volatile organic compounds (VOCs) that are biomarkers of breast cancer. We investigated the positive and negative predictive values (PPV and NPV) of breath VOC biomarkers as indicators of breast cancer risk. We employed ultra-clean breath collection balloons to collect breath samples from 54 women with biopsy-proven breast cancer and 124 cancer-free controls. Breath VOCs were analyzed with gas chromatography (GC) combined with either mass spectrometry (GC MS) or surface acoustic wave detection (GC SAW). Chromatograms were randomly assigned to a training set or a validation set. Monte Carlo analysis identified significant breath VOC biomarkers of breast cancer in the training set, and these biomarkers were incorporated into a multivariate algorithm to predict disease in the validation set. In the unsplit dataset, the predictive algorithms generated discriminant function (DF) values that varied with sensitivity, specificity, PPV and NPV. Using GC MS, test accuracy = 90% (area under curve of receiver operating characteristic in unsplit dataset) and cross-validated accuracy = 77%. Using GC SAW, test accuracy = 86% and cross-validated accuracy = 74%. With both assays, a low DF value was associated with a low risk of breast cancer (NPV > 99.9%). A high DF value was associated with a high risk of breast cancer and PPV rising to 100%. Analysis of breath VOC samples collected with ultra-clean balloons detected biomarkers that accurately predicted risk of breast cancer.
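The PPV and NPV figures reported follow directly from a 2x2 confusion table; a small helper with illustrative counts for a cohort of the study's size (54 cancer, 124 controls), not the paper's actual results:

```python
def predictive_values(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)  # P(disease | positive test)
    npv = tn / (tn + fn)  # P(no disease | negative test)
    return sensitivity, specificity, ppv, npv

# Hypothetical counts at one discriminant-function (DF) cutoff.
sens, spec, ppv, npv = predictive_values(tp=48, fp=10, tn=114, fn=6)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```

Sweeping the DF cutoff trades these four quantities off against each other, which is how the study obtains NPV > 99.9% at low DF values and PPV approaching 100% at high DF values.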

  12. ODS - modified TiO2 nanoparticles for the preparation of self-cleaning superhydrophobic coating

    NASA Astrophysics Data System (ADS)

    Kokare, Ashvini M.; Sutar, Rajaram S.; Deshmukh, S. G.; Xing, Ruimin; Liu, Shanhu; Latthe, Sanjay S.

    2018-05-01

    Rolling water drops carry dust particles off the lotus leaf, demonstrating its self-cleaning performance. The self-cleaning effect has great importance in industry as well as in daily life. The present paper describes the preparation of a self-cleaning superhydrophobic coating through a simple and low-cost dip coating technique. The prepared superhydrophobic surface acts like a lotus leaf. First, TiO2 nanoparticles were dispersed in ethanol, and different concentrations of octadecyltrichlorosilane (ODS) were added to the TiO2 dispersion. The effect of the number of deposition layers on the wettability of the coating was studied. The coating prepared from five deposition layers showed a contact angle higher than 150° and a sliding angle less than 10°. The superhydrophobicity increases with increasing concentration of ODS. A hierarchical rough morphology, which is preferable for superhydrophobicity, was obtained. The prepared coatings were stable against water jet impact and showed repellency towards colored and muddy water. Such superhydrophobic coatings can find enormous scope in self-cleaning applications.

  13. Innovative technologies on fuel assemblies cleaning for sodium fast reactors: First considerations on cleaning process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, N.; Lorcet, H.; Beauchamp, F.

    2012-07-01

    Within the framework of Sodium Fast Reactor development, innovative fuel assembly cleaning operations are investigated to meet the GEN IV goals of safety and of process development. One of the challenges is to mitigate the Sodium Water Reaction currently used in these processes. The potential applications of aqueous solutions of mineral salts (including the possibility of using redox chemical reactions) to mitigate the Sodium Water Reaction are considered in a first part, and a new experimental bench dedicated to this study is described. Anhydrous alternative options based on Na/CO2 interaction are also presented. Then, in a second part, a functional study conducted on the cleaning pit is proposed. Based on experimental feedback, some calculations are carried out to estimate the sodium inventory on the fuel elements, and physical methods like hot inert gas sweeping to reduce this inventory are also presented. Finally, the implementation of these innovative solutions in cleaning pits is studied with regard to the expected performances. (authors)

  14. Super Clean, Super Safe

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The Supersonic Gas/Liquid Cleaning System (SS-GLCS) has applications ranging from cleaning circuit boards to scouring building exteriors. The system does not abrade the surface of the hardware being cleaned, and it requires much lower levels of pressure while using very little water. An alternative to CFC-based solvents, the system mixes air and water from separate pressurized tanks, ejecting the gas-liquid mixture at supersonic speeds from a series of nozzles at the end of a hand-held wand. The water droplets have the kinetic energy to forcibly remove the contaminant material. The system leaves very little fluid that must be handled as contaminated waste. It can be applied in the aerospace, automotive, and medical industries, as well as to circuit boards, electronics, machinery, metals, plastics, and optics. With a nozzle that can be oriented in any direction, the system is adjustable to allow all sides of a part to be cleaned without reorientation. It requires minimal training and is easily moved on built-in casters.

  15. 75 FR 56986 - Argonne National Laboratory, et al.; Notice of Decision on Applications for Duty-Free Entry of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... situ deposition onto clean surfaces. In- situ capacities allow the preparation of clean and well... evaluation of the properties to be studied in these experiments. The instrument also has in-situ preparation... viewed between 8:30 a.m. and 5 p.m. in Room 3720, U.S. Department of Commerce, 14th and Constitution Ave...

  16. An application of the discrete-time Toda lattice to the progressive algorithm by Lanczos and related problems

    NASA Astrophysics Data System (ADS)

    Nakamura, Yoshimasa; Sekido, Hiroto

    2018-04-01

    The finite or the semi-infinite discrete-time Toda lattice has many applications to various areas in applied mathematics. The purpose of this paper is to review how the Toda lattice appears in the Lanczos algorithm through the quotient-difference algorithm and its progressive form (pqd). Then a multistep progressive algorithm (MPA) for solving linear systems is presented. The extended Lanczos parameters can be given not by computing inner products of the extended Lanczos vectors but by using the pqd algorithm with highly relative accuracy in a lower cost. The asymptotic behavior of the pqd algorithm brings us some applications of MPA related to eigenvectors.
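The quotient-difference (qd) scheme underlying the pqd algorithm can be illustrated with the classical rhombus rules on a power series with known poles; this sketch shows only the basic qd recurrences, not the authors' multistep progressive algorithm:

```python
def qd_table(c, columns=2):
    """Classical quotient-difference rhombus rules applied to series
    coefficients c[n]. Column k of q converges to the k-th largest
    ratio lambda_k when c[n] ~ sum_i a_i * lambda_i**n."""
    N = len(c)
    q = [[c[n + 1] / c[n] for n in range(N - 1)]]  # q_1 column: simple quotients
    e = [[0.0] * (N - 1)]                          # e_0 column is identically zero
    for k in range(1, columns):
        qk, ek_prev = q[k - 1], e[k - 1]
        # e_k^(n) = e_{k-1}^(n+1) + q_k^(n+1) - q_k^(n)
        ek = [ek_prev[n + 1] + qk[n + 1] - qk[n] for n in range(len(qk) - 1)]
        # q_{k+1}^(n) = q_k^(n+1) * e_k^(n+1) / e_k^(n)
        qnext = [qk[n + 1] * ek[n + 1] / ek[n] for n in range(len(ek) - 1)]
        e.append(ek)
        q.append(qnext)
    return q

# Coefficients of f(z) = 1/(1 - 2z) + 1/(1 - z): c_n = 2**n + 1.
c = [2.0 ** n + 1.0 for n in range(30)]
q = qd_table(c, columns=2)
print(q[0][-1], q[1][-1])  # the two columns approach the ratios 2 and 1
```

This column-by-column generation is numerically delicate (the e-columns tend to zero), which is exactly why progressive, row-wise variants such as pqd, and dqds-type shifts, were developed to preserve high relative accuracy.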

  17. Development and use of microbial-based cleaning products (MBCPs): Current issues and knowledge gaps.

    PubMed

    Arvanitakis, George; Temmerman, Robin; Spök, Armin

    2018-06-01

    Cleaning products containing microbes as active ingredients are becoming increasingly prevalent as an alternative to chemical-based cleaning products. These microbial-based cleaning products (MBCPs) are being used in domestic and commercial settings (i.e., households and businesses) and institutional settings (e.g., hospitals, schools, etc.), in a variety of cleaning activities (hard surface cleaning, odour control, degreasing, septic tank treatments, etc.). They are typically described as "environmentally friendly" and "non-toxic". Publicly available information sources (scientific literature, patent databases, commercial websites) were searched for information on microbial species contained in MBCPs, their mode of action, cleaning applications in which they are used, and their potential impacts on human health and the environment. Although information was found providing a broad indication of microbial genera/species used, information on specific species/strains and quantities produced and sold is generally lacking. This makes it difficult to conduct a meaningful examination of any risks to human health and the environment from the production and use of MBCPs and to determine how effective current policies and regulatory frameworks are in addressing these issues. These and other challenges were addressed at an international workshop in Ottawa, Canada in June 2013 by a number of stakeholders, including industry, government, academic and non-governmental organizations. Copyright © 2017. Published by Elsevier Ltd.

  18. Comparative study of pulsed laser cleaning applied to weathered marble surfaces

    NASA Astrophysics Data System (ADS)

    Ortiz, P.; Antúnez, V.; Ortiz, R.; Martín, J. M.; Gómez, M. A.; Hortal, A. R.; Martínez-Haya, B.

    2013-10-01

    The removal of unwanted matter from surface stones is a demanding task in the conservation of cultural heritage. This paper investigates the effectiveness of near-infrared (IR) and ultraviolet (UV) laser pulses for the cleaning of surface deposits, iron oxide stains and different types of graffiti (black, red and green sprays and markers, and black cutting-edge ink) on dolomitic white marble. The performance of the laser techniques is compared to common cleaning methods on the same samples, namely pressurized water and chemical treatments. The degree of cleaning achieved with each technique is assessed by means of colorimetric measurements and X-ray microfluorescence. Any morphological changes induced on the marble substrate are monitored with optical and electronic microscopy. It is found that UV pulsed laser ablation at 266 nm manages to clean all the stains except the cutting-edge ink, although some degree of surface erosion is produced. The IR laser pulses at 1064 nm can remove surface deposits and black spray acceptably, but a yellowing is observed on the stone surface after treatment. An economic evaluation shows that pulsed laser cleaning techniques are advantageous for the rapid cleaning of small or inaccessible surface areas, although their extensive application becomes expensive due to the long operating times required.
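
    The colorimetric assessment above is conventionally quantified as a CIELAB colour difference. A minimal sketch with hypothetical colorimeter readings (the paper's actual measurements are not reproduced); note how yellowing shows up mainly in the b* channel:

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) readings."""
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(lab1, lab2)))

# hypothetical colorimeter readings (L*, a*, b*)
clean_reference = (85.0, 1.2, 4.5)   # unweathered marble patch
after_uv_laser  = (82.0, 1.5, 6.0)   # after 266 nm cleaning
after_ir_laser  = (80.0, 2.0, 12.5)  # after 1064 nm cleaning (yellowing: larger b*)

print(delta_e_cie76(clean_reference, after_uv_laser))  # smaller residual difference
print(delta_e_cie76(clean_reference, after_ir_laser))
```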

  19. YM500v2: a small RNA sequencing (smRNA-seq) database for human cancer miRNome research.

    PubMed

    Cheng, Wei-Chung; Chung, I-Fang; Tsai, Cheng-Fong; Huang, Tse-Shun; Chen, Chen-Yang; Wang, Shao-Chuan; Chang, Ting-Yu; Sun, Hsing-Jen; Chao, Jeffrey Yung-Chuan; Cheng, Cheng-Chung; Wu, Cheng-Wen; Wang, Hsei-Wei

    2015-01-01

    We previously presented YM500, which is an integrated database for miRNA quantification, isomiR identification, arm switching discovery and novel miRNA prediction from 468 human smRNA-seq datasets. In this updated YM500v2 database (http://ngs.ym.edu.tw/ym500/), we focus on the cancer miRNome to make the database more disease-orientated. New miRNA-related algorithms developed after YM500 were included in YM500v2, and, more significantly, more than 8000 cancer-related smRNA-seq datasets (including those of primary tumors, paired normal tissues, PBMC, recurrent tumors, and metastatic tumors) were incorporated into YM500v2. Novel miRNAs (miRNAs not included in miRBase R21) were not only predicted by three independent algorithms but also cleaned by a new in silico filtration strategy and validated by wet-lab data such as Cross-Linked ImmunoPrecipitation sequencing (CLIP-seq) to reduce the false-positive rate. A new function, 'Meta-analysis', additionally allows users to identify, in real time, differentially expressed miRNAs and arm-switching events according to user-defined sample groups and dozens of clinical criteria tidied up by proficient clinicians. The cancer miRNAs identified hold potential for both basic research and biotech applications. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Evaluation of Driver Visibility from Mobile LIDAR Data and Weather Conditions

    NASA Astrophysics Data System (ADS)

    González-Jorge, H.; Díaz-Vilariño, L.; Lorenzo, H.; Arias, P.

    2016-06-01

    Visibility of drivers is crucial to ensure road safety. Visibility is influenced by two main factors: the geometry of the road and the weather present therein. The present work depicts an approach for automatic visibility evaluation using mobile LiDAR data and climate information provided by weather stations located in the neighbourhood of the road. The methodology is based on a ray-tracing algorithm to detect occlusions from point clouds with the purpose of identifying the visibility area from each driver position. The resulting data are normalized with the climate information to provide a polyline with an accurate area of visibility. Visibility ranges from 25 m (heavy fog) to more than 10,000 m (clean atmosphere). Values over 250 m are not taken into account for road safety purposes, since this value corresponds to the maximum braking distance of a vehicle. Two case studies are evaluated: an urban road in the city of Vigo (Spain), and an inter-urban road between the city of Ourense and the village of Castro Caldelas (Spain). In both cases, data from the Galician Weather Agency (Meteogalicia) are used. The algorithm shows promising results, allowing the detection of particularly dangerous areas from the viewpoint of driver visibility. The mountain road between Ourense and Castro Caldelas, with many slopes and sharp curves, is of special interest for this type of application. In this case, poor visibility can especially contribute to vehicles running over pedestrians or cyclists traveling on the road shoulders.
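
    The occlusion test behind such a ray-tracing algorithm can be sketched in one dimension: along a single sight ray, a sample is visible while its elevation angle exceeds that of every closer sample. This is a simplified, hypothetical stand-in for the paper's 3-D point-cloud processing, with a meteorological-visibility cap standing in for the weather normalization:

```python
import math

def viewshed(eye_height, profile):
    """1-D viewshed along one sight ray: a sample is visible while its
    elevation angle exceeds that of every closer sample."""
    visible, max_angle = [], -math.inf
    for dist, elev in profile:
        angle = math.atan2(elev - eye_height, dist)
        visible.append(angle > max_angle)
        max_angle = max(max_angle, angle)
    return visible

def sight_distance(eye_height, profile, met_visibility):
    """Farthest visible road sample, capped by meteorological visibility (m)."""
    vis = viewshed(eye_height, profile)
    reach = [d for (d, _), v in zip(profile, vis) if v and d <= met_visibility]
    return max(reach, default=0.0)

# road samples (distance m, elevation m) with a 1.5 m crest at 20 m,
# seen from a driver eye height of 1.2 m (all values hypothetical)
profile = [(10.0, 0.0), (20.0, 1.5), (30.0, 0.0), (40.0, 0.0)]
print(viewshed(1.2, profile))                              # [True, True, False, False]
print(sight_distance(1.2, profile, met_visibility=250.0))  # 20.0 (clear weather)
print(sight_distance(1.2, profile, met_visibility=15.0))   # 10.0 (fog cap)
```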

  1. Application of integration algorithms in a parallel processing environment for the simulation of jet engines

    NASA Technical Reports Server (NTRS)

    Krosel, S. M.; Milner, E. J.

    1982-01-01

    The application of predictor-corrector integration algorithms developed for a digital parallel processing environment is investigated. The algorithms are implemented and evaluated through the use of a software simulator which provides an approximate representation of the parallel processing hardware. Test cases which focus on the use of the algorithms are presented and a specific application using a linear model of a turbofan engine is considered. Results are presented showing the effects of integration step size and the number of processors on simulation accuracy. Real-time performance, interprocessor communication, and algorithm startup are also discussed.
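
    A serial predictor-corrector step of this general kind can be sketched as follows. The choice of a two-step Adams-Bashforth predictor with a trapezoidal corrector run in PECE mode is an assumption for illustration; it does not reproduce the report's parallel partitioning or the turbofan model:

```python
import math

def pece_ab2_am2(f, y0, t0, t1, n):
    """Two-step Adams-Bashforth predictor + trapezoidal (Adams-Moulton)
    corrector in PECE mode, with a Heun bootstrap for the first step."""
    h = (t1 - t0) / n
    t, y = t0, y0
    f_prev = f(t, y)
    y_pred = y + h * f_prev                              # Euler predict (bootstrap)
    y = y + 0.5 * h * (f_prev + f(t + h, y_pred))        # trapezoid correct
    t += h
    for _ in range(n - 1):
        f_curr = f(t, y)
        y_pred = y + h * (1.5 * f_curr - 0.5 * f_prev)   # AB2 predict
        y = y + 0.5 * h * (f_curr + f(t + h, y_pred))    # AM2 correct
        f_prev = f_curr
        t += h
    return y

# linear test problem y' = -y, y(0) = 1, so y(1) = exp(-1)
print(pece_ab2_am2(lambda t, y: -y, 1.0, 0.0, 1.0, 100))
```

    In a parallel setting, the appeal of PECE schemes is that the predictor evaluation for each subsystem can proceed concurrently before the corrector synchronizes state.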

  2. Decision tree and ensemble learning algorithms with their applications in bioinformatics.

    PubMed

    Che, Dongsheng; Liu, Qi; Rasheed, Khaled; Tao, Xiuping

    2011-01-01

    Machine learning approaches have wide applications in bioinformatics, and decision tree is one of the successful approaches applied in this field. In this chapter, we briefly review decision tree and related ensemble algorithms and show the successful applications of such approaches on solving biological problems. We hope that by learning the algorithms of decision trees and ensemble classifiers, biologists can get the basic ideas of how machine learning algorithms work. On the other hand, by being exposed to the applications of decision trees and ensemble algorithms in bioinformatics, computer scientists can get better ideas of which bioinformatics topics they may work on in their future research directions. We aim to provide a platform to bridge the gap between biologists and computer scientists.
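
    The basic ideas can be illustrated with depth-1 trees (decision stumps) combined by bootstrap aggregation and majority voting. This is a toy sketch with hypothetical two-feature data, not an algorithm taken from the chapter:

```python
import random
from collections import Counter

def fit_stump(X, y):
    """Fit a depth-1 decision tree: pick the (feature, threshold) split
    that minimises misclassifications on the training sample."""
    best, best_err = None, len(y) + 1
    for j in range(len(X[0])):
        for thr in {x[j] for x in X}:
            left = [yi for x, yi in zip(X, y) if x[j] <= thr]
            right = [yi for x, yi in zip(X, y) if x[j] > thr]
            if not left or not right:
                continue
            pl = Counter(left).most_common(1)[0][0]
            pr = Counter(right).most_common(1)[0][0]
            err = sum(yi != pl for yi in left) + sum(yi != pr for yi in right)
            if err < best_err:
                best, best_err = (j, thr, pl, pr), err
    return best

def predict(stump, x):
    j, thr, pl, pr = stump
    return pl if x[j] <= thr else pr

def bagged_predict(stumps, x):
    """Majority vote over an ensemble of bootstrap-trained stumps."""
    return Counter(predict(s, x) for s in stumps).most_common(1)[0][0]

random.seed(0)
# toy two-feature data: class 1 whenever feature 0 exceeds 0.5 (hypothetical)
X = [[0.1, 3.0], [0.2, 1.0], [0.4, 2.0], [0.6, 1.5], [0.8, 2.5], [0.9, 0.5]]
y = [0, 0, 0, 1, 1, 1]
stumps = []
for _ in range(15):                       # bagging: bootstrap resamples
    idx = [random.randrange(len(X)) for _ in range(len(X))]
    s = fit_stump([X[i] for i in idx], [y[i] for i in idx])
    if s is not None:                     # skip degenerate resamples
        stumps.append(s)
print([bagged_predict(stumps, xi) for xi in X])
```

    Individual stumps trained on skewed resamples can vote wrongly; the ensemble vote smooths these errors out, which is the essence of bagging.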

  3. Experimental study of fouling and cleaning of sintered stainless steel membrane in electro-microfiltration of calcium salt particles.

    PubMed

    Qin, Frank G F; Mawson, John; Zeng, Xin An

    2011-05-30

    Sintered stainless steel (SSS) microfiltration membranes, which served directly as electrodes, were used in experiments to separate Alamin, a calcium-salt- and protein-containing particulate found in dairy processing. Fouling and cleaning of the SSS membranes under the application of an external electric field were studied. The imposed electric field was found to drive apart the pH of the permeate and the retentate. This in turn altered the solubility of the calcium salt and affected the performance of the electro-microfiltration membrane. The use of the electric field as an enhanced cleaning-in-place (CIP) method when back-flushing the SSS membrane was also studied.

  4. Experimental Study of Fouling and Cleaning of Sintered Stainless Steel Membrane in Electro-Microfiltration of Calcium Salt Particles

    PubMed Central

    Qin, Frank G. F.; Mawson, John; Zeng, Xin An

    2011-01-01

    Sintered stainless steel (SSS) microfiltration membranes, which served directly as electrodes, were used in experiments to separate Alamin, a calcium-salt- and protein-containing particulate found in dairy processing. Fouling and cleaning of the SSS membranes under the application of an external electric field were studied. The imposed electric field was found to drive apart the pH of the permeate and the retentate. This in turn altered the solubility of the calcium salt and affected the performance of the electro-microfiltration membrane. The use of the electric field as an enhanced cleaning-in-place (CIP) method when back-flushing the SSS membrane was also studied. PMID:24957615

  5. Case study of a floor-cleaning robot

    NASA Astrophysics Data System (ADS)

    Branch, Allan C.

    1998-01-01

    Developing technologies suitable for a high-level robotic application such as floor cleaning has proved extremely difficult. Developing the robot mobility technology has been a stumbling block, and developing and integrating the application technology with the machine and its mobility system has been another difficult stage in this quest; but doing so in a cost-effective and realistic manner, suitable for the marketplace and able to compete with humans and manually operated machines, has been the most difficult of all. This paper describes one of these quests, spanning a 14-year period and resulting in what is hoped will be the world's first commercially manufactured household robot vacuum cleaner.

  6. High throughput determination of cleaning solutions to prevent the fouling of an anion exchange resin.

    PubMed

    Elich, Thomas; Iskra, Timothy; Daniels, William; Morrison, Christopher J

    2016-06-01

    Effective cleaning of chromatography resin is required to prevent fouling and maximize the number of processing cycles which can be achieved. Optimization of resin cleaning procedures, however, can lead to prohibitive material, labor, and time requirements, even when using milliliter scale chromatography columns. In this work, high throughput (HT) techniques were used to evaluate cleaning agents for a monoclonal antibody (mAb) polishing step utilizing Fractogel® EMD TMAE HiCap (M) anion exchange (AEX) resin. For this particular mAb feed stream, the AEX resin could not be fully restored with traditional NaCl and NaOH cleaning solutions, resulting in a loss of impurity capacity with resin cycling. Miniaturized microliter scale chromatography columns and an automated liquid handling system (LHS) were employed to evaluate various experimental cleaning conditions. Cleaning agents were monitored for their ability to maintain resin impurity capacity over multiple processing cycles by analyzing the flowthrough material for turbidity and high molecular weight (HMW) content. HT experiments indicated that a 167 mM acetic acid strip solution followed by a 0.5 M NaOH, 2 M NaCl sanitization provided approximately 90% cleaning improvement over solutions containing solely NaCl and/or NaOH. Results from the microliter scale HT experiments were confirmed in subsequent evaluations at the milliliter scale. These results identify cleaning agents which may restore resin performance for applications involving fouling species in ion exchange systems. In addition, this work demonstrates the use of miniaturized columns operated with an automated LHS for HT evaluation of chromatographic cleaning procedures, effectively decreasing material requirements while simultaneously increasing throughput. Biotechnol. Bioeng. 2016;113: 1251-1259. © 2015 Wiley Periodicals, Inc.

  7. Cleaning of parts for new manufacturing and parts rebuilding

    NASA Astrophysics Data System (ADS)

    Doherty, Jeff

    1994-06-01

    Parts cleaning is the largest single expense, and the most time-consuming activity, in rebuilding and new manufacturing. On average, 25% to 40% of the total labor and overhead burden is spent on cleaning. EPA and OSHA pressures add to the burden by making some methods and chemicals obsolete. Some of the processes and chemicals in current use will be curtailed and/or outlawed in the future. How can shops and industries make long-term decisions or capital investments in cleaning and process improvements when the government keeps changing its rules? At the MART Corporation in Saint Louis, Missouri, we manufacture a line of cabinet-style batch cleaning machines known as Power Washers. Twenty years ago MART invented and patented the Power Washer process, a cleaning method that recycles wash solution and blasts contaminants as they are washed off the more heavily contaminated parts. Since the initial invention, MART has continued to refine the washing process through R&D and to develop ancillary systems that comply with EPA and OSHA regulations. For applications involving new industrial parts or items requiring surfaces cleaned to specification, MART provides filtration and solution-conditioning systems, part-drying operations, and triple rinsing. Units are available in stainless steel or higher alloys. We are not alone in the washer manufacturing business. You have many choices of cleaning solutions (no pun intended) which will perform in your operations and yield good results. As a manufacturer, we are interested in your success with our equipment. We have all heard the horror stories of companies having selected inappropriate cleaning systems and/or processes which then brought the company to its knees, production-wise. Assembly, appearance, warranty, and performance shortcomings of finished products can often be directly related to the cleaning process and its shortcomings.

  8. Recent Progress in Fabrication and Applications of Superhydrophobic Coating on Cellulose-Based Substrates

    PubMed Central

    Liu, Hui; Gao, Shou-Wei; Cai, Jing-Sheng; He, Cheng-Lin; Mao, Jia-Jun; Zhu, Tian-Xue; Chen, Zhong; Huang, Jian-Ying; Meng, Kai; Zhang, Ke-Qin; Al-Deyab, Salem S.; Lai, Yue-Kun

    2016-01-01

    Multifunctional fabrics with special wettability have attracted a lot of interest in both fundamental research and industry applications over the last two decades. In this review, recent progress of various kinds of approaches and strategies to construct super-antiwetting coating on cellulose-based substrates (fabrics and paper) has been discussed in detail. We focus on the significant applications related to artificial superhydrophobic fabrics with special wettability and controllable adhesion, e.g., oil-water separation, self-cleaning, asymmetric/anisotropic wetting for microfluidic manipulation, air/liquid directional gating, and micro-template for patterning. In addition to the anti-wetting properties and promising applications, particular attention is paid to coating durability and other incorporated functionalities, e.g., air permeability, UV-shielding, photocatalytic self-cleaning, self-healing and patterned antiwetting properties. Finally, the existing difficulties and future prospects of this traditional and developing field are briefly proposed and discussed. PMID:28773253

  9. Non-aqueous cleaning solvent substitution

    NASA Technical Reports Server (NTRS)

    Meier, Gerald J.

    1994-01-01

    A variety of environmental, safety, and health concerns exist over the use of chlorinated and fluorinated cleaning solvents. Sandia National Laboratories, Lawrence Livermore National Laboratories, and the Kansas City Division of AlliedSignal have combined efforts to focus on finding alternative cleaning solvents and processes which are effective, environmentally safe, and compliant with local, state, and federal regulations. An alternative solvent has been identified, qualified, and implemented into production of complex electronic assemblies, where aqueous and semi-aqueous cleaning processes are not allowed. Extensive compatibility studies were performed with components, piece-parts, and materials. Electrical testing and accelerated aging were used to screen for detrimental, long-term effects. A terpene, d-limonene, was selected as the solvent of choice, and it was found to be compatible with the components and materials tested. A brief history of the overall project will be presented, along with representative cleaning efficiency results, compatibility results, and residual solvent data. The electronics industry is constantly searching for proven methods and environmentally safe materials to use in manufacturing processes. The information in this presentation will provide another option to consider on future projects for applications requiring high levels of quality, reliability, and cleanliness from non-aqueous cleaning processes.

  10. Surface oxidation of GaN(0001): Nitrogen plasma-assisted cleaning for ultrahigh vacuum applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gangopadhyay, Subhashis; Schmidt, Thomas, E-mail: tschmidt@ifp.uni-bremen.de; Kruse, Carsten

    The cleaning of metal-organic vapor-phase epitaxial GaN(0001) template layers grown on sapphire has been investigated. Different procedures, performed under ultrahigh vacuum conditions, including degassing and exposure to active nitrogen from a radio-frequency nitrogen plasma source, have been compared. For this purpose, x-ray photoelectron spectroscopy, reflection high-energy electron diffraction, and scanning tunneling microscopy have been employed in order to assess chemical as well as structural and morphological surface properties. Initial degassing at 600 °C under ultrahigh vacuum conditions only partially eliminates the surface contaminants. In contrast to plasma-assisted nitrogen cleaning at temperatures as low as 300 °C, active-nitrogen exposure at temperatures as high as 700 °C removes the majority of oxide species from the surface. However, extended high-temperature active-nitrogen cleaning leads to severe surface roughening. Optimum results, regarding both the removal of surface oxides and the structural and morphological quality of the surface, have been achieved for a combination of initial low-temperature plasma-assisted cleaning, followed by a rapid nitrogen plasma-assisted cleaning at high temperature.

  11. In situ oxygen plasma cleaning of microswitch surfaces—comparison of Ti and graphite electrodes

    NASA Astrophysics Data System (ADS)

    Oh, Changho; Streller, Frank; Ashurst, W. Robert; Carpick, Robert W.; de Boer, Maarten P.

    2016-11-01

    Ohmic micro- and nanoswitches are of interest for a wide variety of applications including radio frequency communications and as low power complements to transistors. In these switches, it is of paramount importance to maintain surface cleanliness in order to prevent frequent failure by tribopolymer growth. To prepare surfaces, an oxygen plasma clean is expected to be beneficial compared to a high temperature vacuum bakeout because of shorter cleaning time (<5 min compared to ~24 h) and active removal of organic contaminants. We demonstrate that sputtering of the electrode material during oxygen plasma cleaning is a critical consideration for effective cleaning of switch surfaces. With Ti electrodes, a TiO x layer forms that increases electrical contact resistance. When plasma-cleaned using graphite electrodes, Pt-coated microswitches exhibit a long lifetime with consistently low resistance (<0.5 Ω variation over 300 million cycles) if the test chamber is refilled with ultra-high purity nitrogen and if the devices are not exposed to laboratory air. Their current-voltage characteristic is also linear at the millivolt level. This is important for nanoswitches which will be operated in that range.

  12. Resistance of superhydrophobic and oleophobic surfaces to varied temperature applications on 316L SS

    NASA Astrophysics Data System (ADS)

    Shams, Hamza; Basit, Kanza; Saleem, Sajid; Siddiqui, Bilal A.

    316L SS, also called marine stainless steel, is an important material for structural and marine applications. When superhydrophobic and oleophobic coatings are applied on 316L SS, it shows significant resistance to wear and corrosion. This paper aims to validate the coating manufacturers' information on the optimal temperature range and to test the viability of the coatings against multiple oil-based cleaning agents. 316L SS was coated with multiple superhydrophobic and oleophobic coatings, observed under SEM to verify adhesion and thickness, and then scanned under FFM to validate the tribological information. The samples were then dipped into multiple cleaning agents maintained across the range of operating temperatures specified by the manufacturer. The coating was observed for deterioration at fixed time intervals through SEM and FFM. A comparison was drawn to identify the most critical cleaning agent and the most critical temperature at which the coating fails, leaving the base substrate exposed to the environment.

  13. Cleaning and Cleanliness Measurement of Additive Manufactured Parts

    NASA Technical Reports Server (NTRS)

    Mitchell, Mark A.; Raley, Randy

    2016-01-01

    The successful acquisition and utilization of piece parts and assemblies for contamination sensitive applications requires application of cleanliness acceptance criteria. Contamination can be classified using many different schemes. One common scheme is classification as organic, ionic and particulate contaminants. These may be present in and on the surface of solid components and assemblies or may be dispersed in various gaseous or liquid media. This discussion will focus on insoluble particle contamination on the surfaces of piece parts and assemblies. Cleanliness of parts can be controlled using two strategies, referred to as gross cleanliness and precision cleanliness. Under a gross cleanliness strategy acceptance is based on visual cleanliness. This approach introduces a number of concerns that render it unsuitable for controlling cleanliness of high technology products. Under the precision cleanliness strategy, subjective, visual assessment of cleanliness is replaced by objective measurement of cleanliness. When a precision cleanliness strategy is adopted there naturally arises the question: How clean is clean enough? The methods for establishing objective cleanliness acceptance limits will be discussed.
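
    Under a precision cleanliness strategy, acceptance reduces to comparing measured particle counts against per-size-bin limits. A minimal sketch in which every number is hypothetical; a real program would take its limits from the governing cleanliness specification (e.g. an IEST-STD-CC1246 level):

```python
def meets_cleanliness_limit(counts, limits):
    """True if the measured particle count in every size bin is within the
    acceptance limit for that bin."""
    return all(counts.get(size, 0) <= limit for size, limit in limits.items())

# size bin (micrometres) -> max allowed particles per sampled area
# (all numbers hypothetical, for illustration only)
limits = {15: 240, 25: 53, 50: 8, 100: 1}

print(meets_cleanliness_limit({15: 120, 25: 30, 50: 2, 100: 0}, limits))  # True
print(meets_cleanliness_limit({15: 120, 25: 30, 50: 9, 100: 0}, limits))  # False
```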

  14. Raman enhancement on ultra-clean graphene quantum dots produced by quasi-equilibrium plasma-enhanced chemical vapor deposition.

    PubMed

    Liu, Donghua; Chen, Xiaosong; Hu, Yibin; Sun, Tai; Song, Zhibo; Zheng, Yujie; Cao, Yongbin; Cai, Zhi; Cao, Min; Peng, Lan; Huang, Yuli; Du, Lei; Yang, Wuli; Chen, Gang; Wei, Dapeng; Wee, Andrew Thye Shen; Wei, Dacheng

    2018-01-15

    Graphene is regarded as a potential surface-enhanced Raman spectroscopy (SERS) substrate. However, the application of graphene quantum dots (GQDs) has had limited success due to material quality. Here, we develop a quasi-equilibrium plasma-enhanced chemical vapor deposition method to produce high-quality, ultra-clean GQDs with sizes down to 2 nm directly on SiO2/Si, which are used as SERS substrates. The enhancement factor, which depends on the GQD size, is higher than that of conventional graphene sheets, with sensitivity down to 1 × 10⁻⁹ mol L⁻¹ rhodamine. This is attributed to the high-quality GQDs with atomically clean surfaces and a large number of edges, as well as the enhanced charge transfer between molecules and GQDs with appropriate diameters due to the existence of Van Hove singularities in the electronic density of states. This work demonstrates a sensitive SERS substrate, and is valuable for applications of GQDs in graphene-based photonics and optoelectronics.
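
    The enhancement factor referred to above is conventionally estimated by normalising Raman intensities to the number of probed molecules. A one-line sketch; all numbers are hypothetical, not measurements from the paper:

```python
def sers_enhancement_factor(i_sers, n_sers, i_ref, n_ref):
    """Standard substrate enhancement-factor estimate:
    EF = (I_SERS / N_SERS) / (I_ref / N_ref), i.e. the Raman intensity per
    probed molecule on the substrate relative to the bare reference."""
    return (i_sers / n_sers) / (i_ref / n_ref)

# e.g. 50x the signal from 1/100 the molecules -> EF = 5000
print(sers_enhancement_factor(50.0, 1e10, 1.0, 1e12))
```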

  15. Scalable Multifunctional Ultra-thin Graphite Sponge: Free-standing, Superporous, Superhydrophobic, Oleophilic Architecture with Ferromagnetic Properties for Environmental Cleaning

    PubMed Central

    Bay, Hamed Hosseini; Patino, Daisy; Mutlu, Zafer; Romero, Paige; Ozkan, Mihrimah; Ozkan, Cengiz S.

    2016-01-01

    Water decontamination and oil/water separation are principal motives in the surge to develop novel means for sustainability. In this prospect, supplying clean water to ecosystems is as important as recovering oil spills, since supplies are scarce. Inspired by the goal of an engineering material which not only serves this purpose but can also be adapted for other applications that preserve natural resources, a facile, template-free process is suggested to fabricate a superporous, superhydrophobic ultra-thin graphite sponge. Moreover, the process is designed to be inexpensive and scalable. The fabricated sponge can be used to clean up different types of oil, organic solvents, and toxic and corrosive contaminants. This versatile microstructure retains its functionality even when pulverized. The sponge is applicable to targeted sorption and collection due to its ferromagnetic properties. We hope that such a cost-effective process can be embraced and implemented widely. PMID:26908346

  16. Special Issue on Time Scale Algorithms

    DTIC Science & Technology

    2008-01-01

    are currently Two Way Satellite Time and Frequency Transfer (TWSTFT) and GPS carrier phase time transfer. The interest in time scale algorithms and...laboratory-specific innovations and practices, GNSS applications, UTC generation, TWSTFT applications, GPS applications, small-ensemble applications

  17. Level-1C Product from AIRS: Principal Component Filtering

    NASA Technical Reports Server (NTRS)

    Manning, Evan M.; Jiang, Yibo; Aumann, Hartmut H.; Elliott, Denis A.; Hannon, Scott

    2012-01-01

    The Atmospheric Infrared Sounder (AIRS), launched on the EOS Aqua spacecraft on May 4, 2002, is a grating spectrometer with 2378 channels in the range 3.7 to 15.4 microns. In a grating spectrometer each individual radiance measurement is largely independent of all others. Most measurements are extremely accurate and have very low noise levels. However, some channels exhibit high noise levels or other anomalous behavior, complicating applications needing radiances throughout a band, such as cross-calibration with other instruments and regression retrieval algorithms. The AIRS Level-1C product is similar to Level-1B but with instrument artifacts removed. This paper focuses on the "cleaning" portion of Level-1C, which identifies bad radiance values within spectra and produces substitute radiances using redundant information from other channels. The substitution is done in two passes, first with a simple combination of values from neighboring channels, then with principal components. After results of the substitution are shown, differences between principal component reconstructed values and observed radiances are used to investigate detailed noise characteristics and spatial misalignment in other channels.
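
    The principal-component substitution step can be sketched as follows: fit PC coefficients using only the good channels of a spectrum, then replace the flagged channels from the reconstruction. The synthetic spectra, component count, and flagged channels below are assumptions for illustration, not AIRS data or the operational Level-1C components:

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic "spectra" spanned by a few smooth modes, standing in for a
# training set of clean radiance spectra
x = np.linspace(0.0, 1.0, 60)
basis = np.vstack([np.sin((k + 1) * np.pi * x) for k in range(3)])
train = rng.normal(size=(500, 3)) @ basis + 0.01 * rng.normal(size=(500, 60))

mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
pcs = vt[:5]                                   # leading principal components

def clean_spectrum(spec, bad):
    """Replace channels flagged `bad` using a PC fit to the good channels."""
    good = ~bad
    coef, *_ = np.linalg.lstsq(pcs[:, good].T, spec[good] - mean[good],
                               rcond=None)
    recon = mean + coef @ pcs
    out = spec.copy()
    out[bad] = recon[bad]                      # substitute only bad channels
    return out

true_spec = np.array([1.0, -0.5, 0.3]) @ basis
observed = true_spec.copy()
bad = np.zeros(60, dtype=bool)
bad[[10, 11, 40]] = True
observed[bad] = 99.0                           # simulated bad-channel spikes
cleaned = clean_spectrum(observed, bad)
print(np.abs(cleaned - true_spec).max())       # small residual
```

    Good channels pass through untouched; only the flagged channels are filled from the reconstruction, mirroring the two-pass substitution described in the abstract.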

  18. On the Impact of Localization and Density Control Algorithms in Target Tracking Applications for Wireless Sensor Networks

    PubMed Central

    Campos, Andre N.; Souza, Efren L.; Nakamura, Fabiola G.; Nakamura, Eduardo F.; Rodrigues, Joel J. P. C.

    2012-01-01

    Target tracking is an important application of wireless sensor networks. The networks' ability to locate and track an object is directly linked to the nodes' ability to locate themselves. Consequently, localization systems are essential for target tracking applications. In addition, sensor networks are often deployed in remote or hostile environments. Therefore, density control algorithms are used to increase network lifetime while maintaining its sensing capabilities. In this work, we analyze the impact of localization algorithms (RPE and DPE) and density control algorithms (GAF, A3 and OGDC) on target tracking applications. We adapt the density control algorithms to address the k-coverage problem. In addition, we analyze the impact of network density, residual integration with density control, and k-coverage on both target tracking accuracy and network lifetime. Our results show that DPE is a better choice for target tracking applications than RPE. Moreover, among the evaluated density control algorithms, OGDC is the best option. Although the choice of the density control algorithm has little impact on the tracking precision, OGDC outperforms GAF and A3 in terms of tracking time. PMID:22969329
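
    The k-coverage condition that the adapted density control algorithms must preserve can be checked directly. A plain geometric sketch with hypothetical coordinates (not code from the paper):

```python
def k_covered(targets, sensors, sensing_range, k):
    """True if every target point lies within the sensing range of at
    least k active sensors (the k-coverage condition)."""
    r2 = sensing_range ** 2
    for tx, ty in targets:
        in_range = sum((tx - sx) ** 2 + (ty - sy) ** 2 <= r2
                       for sx, sy in sensors)
        if in_range < k:
            return False
    return True

# hypothetical active-sensor positions left by a density control algorithm
sensors = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
targets = [(0.5, 0.5), (0.9, 0.1)]
print(k_covered(targets, sensors, sensing_range=1.0, k=2))  # True
print(k_covered(targets, sensors, sensing_range=1.0, k=4))  # False
```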

  19. Fast Demand Forecast of Electric Vehicle Charging Stations for Cell Phone Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Majidpour, Mostafa; Qiu, Charlie; Chung, Ching-Yen

    This paper describes the core cellphone application algorithm which has been implemented for the prediction of energy consumption at Electric Vehicle (EV) Charging Stations at UCLA. For this interactive user application, the total time of accessing the database, processing the data, and making the prediction needs to be within a few seconds. We analyze four relatively fast machine-learning-based time series prediction algorithms for our prediction engine: Historical Average, k-Nearest Neighbor, Weighted k-Nearest Neighbor, and Lazy Learning. The Nearest Neighbor algorithm (k-Nearest Neighbor with k=1) shows better performance and is selected as the prediction algorithm implemented for the cellphone application. Two applications have been designed on top of the prediction algorithm: one predicts the expected available energy at the station and the other predicts the expected charging finishing time. The total time, including accessing the database, data processing, and prediction, is about one second for both applications.
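
    The selected Nearest Neighbor predictor can be sketched as follows; the window length and the toy load series are assumptions for illustration (the UCLA feature set and database layout are not reproduced):

```python
def nn_forecast(history, window):
    """Nearest-neighbour time-series prediction (k-NN with k = 1).

    Find the past window most similar to the latest `window` observations
    and return the value that followed it."""
    query = history[-window:]
    best_j, best_d = None, float("inf")
    for j in range(len(history) - window):            # candidate past windows
        d = sum((history[j + i] - query[i]) ** 2 for i in range(window))
        if d < best_d:
            best_j, best_d = j, d
    return history[best_j + window]                   # value that followed

# hypothetical hourly energy draw (kWh) with a repeating pattern
load = [1, 2, 4, 3, 1, 2, 4, 3, 1, 2, 4]
print(nn_forecast(load, window=3))  # 3
```

    The whole search is a single linear scan, which is why this family of methods fits the sub-second budget quoted in the abstract.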

  20. Development of clean coal and clean soil technologies using advanced agglomeration technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ignasiak, B.; Pawlak, W.; Szymocha, K.

    1990-04-01

    The specific objectives of the bituminous coal program were to explore and evaluate the application of advanced agglomeration technology for: (1) desulphurization of bituminous coals to a sulphur content acceptable within the current EPA SO₂ emission guidelines; (2) deashing of bituminous coals to an ash content of less than 10 percent; and (3) increasing the calorific value of bituminous coals to above 13,000 Btu/lb. (VC)

  1. Photo-oxidation catalysts

    DOEpatents

    Pitts, J Roland [Lakewood, CO; Liu, Ping [Irvine, CA; Smith, R Davis [Golden, CO

    2009-07-14

    Photo-oxidation catalysts and methods for cleaning a metal-based catalyst are disclosed. An exemplary catalyst system implementing a photo-oxidation catalyst may comprise a metal-based catalyst, and a photo-oxidation catalyst for cleaning the metal-based catalyst in the presence of light. The exposure to light enables the photo-oxidation catalyst to substantially oxidize absorbed contaminants and reduce accumulation of the contaminants on the metal-based catalyst. Applications are also disclosed.

  2. Large-Volume Resonant Microwave Discharge for Plasma Cleaning of a CEBAF 5-Cell SRF Cavity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Mammosser, S. Ahmed, K. Macha, J. Upadhyay, M. Nikolić, S. Popović, L. Vušković

    2012-07-01

    We report the preliminary results on plasma generation in a 5-cell CEBAF superconducting radio-frequency (SRF) cavity for the application of cavity interior surface cleaning. CEBAF currently has ≈300 of these five-cell cavities installed in the Jefferson Lab accelerator, which are mostly limited by cavity surface contamination. The development of an in-situ cavity surface cleaning method utilizing a resonant microwave discharge could lead to significant CEBAF accelerator performance improvement. This microwave discharge is currently being used for the development of a set of plasma cleaning procedures targeted at the removal of various organic, metal, and metal oxide impurities. These contaminants are responsible for the increase of surface resistance and the reduction of RF performance in installed cavities. The CEBAF five-cell cavity volume is ≈0.5 m², which places the discharge in the category of large-volume plasmas. The CEBAF cavity has cylindrical symmetry, but its elliptical shape and transversal power coupling make it an unusual plasma application, which requires special consideration of microwave breakdown. Our preliminary study includes microwave breakdown and optical spectroscopy, which were used to define the operating pressure range and the rate of removal of organic impurities.

  3. 40 CFR 63.4081 - Am I subject to this subpart?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... appliance parts and products; (2) Preparation of a coating for application (e.g., mixing in thinners and..., spray guns or dip tanks; (4) Application of porcelain enamel, powder coating, and asphalt interior...) Cleaning of equipment used in coating operations (e.g., application equipment, hangers, racks); (7) Storage...

  4. 40 CFR 60.620 - Applicability and designation of affected facility.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Performance for Petroleum Dry Cleaners § 60.620 Applicability and designation of affected facility. (a) The provisions of this subpart are applicable to the following affected facilities located at a petroleum dry... pounds): Petroleum solvent dry cleaning dryers, washers, filters, stills, and settling tanks. (1) When...

  5. 7 CFR 634.13 - Project applications.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 6 2014-01-01 2014-01-01 false Project applications. 634.13 Section 634.13..., DEPARTMENT OF AGRICULTURE LONG TERM CONTRACTING RURAL CLEAN WATER PROGRAM Project Authorization and Funding § 634.13 Project applications. (a) The SRCWCC is to assure that a process exists to prepare the RCWP...

  6. 7 CFR 634.13 - Project applications.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 6 2013-01-01 2013-01-01 false Project applications. 634.13 Section 634.13..., DEPARTMENT OF AGRICULTURE LONG TERM CONTRACTING RURAL CLEAN WATER PROGRAM Project Authorization and Funding § 634.13 Project applications. (a) The SRCWCC is to assure that a process exists to prepare the RCWP...

  7. Processing method of images obtained during the TESIS/CORONAS-PHOTON experiment

    NASA Astrophysics Data System (ADS)

    Kuzin, S. V.; Shestov, S. V.; Bogachev, S. A.; Pertsov, A. A.; Ulyanov, A. S.; Reva, A. A.

    2011-04-01

    In January 2009, the CORONAS-PHOTON spacecraft was successfully launched. It includes a set of telescopes and spectroheliometers—TESIS—designed to image the solar corona in the soft X-ray and EUV spectral ranges. Due to features of the readout system, obtaining physical information from these images requires preprocessing: removing the background, correcting the white (flat) field, and aligning and cleaning the images. The paper discusses the algorithms and software developed and used for this preprocessing.
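    The preprocessing steps named above (background removal and white-field correction) follow standard CCD frame calibration. A minimal sketch with NumPy; the function name and toy frames are illustrative, not taken from the TESIS pipeline:

```python
import numpy as np

def preprocess(raw, dark, flat):
    """Basic CCD preprocessing: dark-frame subtraction (background
    removal) and flat-field (white-field) correction. `raw`, `dark`,
    and `flat` are 2-D arrays of equal shape; `flat` is an exposure
    of a uniformly lit field."""
    corrected = raw.astype(float) - dark      # remove background/offset
    gain = flat.astype(float) - dark          # pixel-to-pixel sensitivity
    gain /= gain.mean()                       # normalise to unit mean
    return corrected / gain                   # flat-field correction

# toy frames: a uniform scene seen through a vignetting-like gain pattern
gain_true = np.linspace(0.8, 1.2, 16).reshape(4, 4)
dark = np.full((4, 4), 5.0)
raw = 100.0 * gain_true + dark
flat = 200.0 * gain_true + dark
clean = preprocess(raw, dark, flat)
print(np.allclose(clean, 100.0))  # gain pattern and offset removed
```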

  8. PRN 93-3: Labeling Statement Prohibiting Application to Water

    EPA Pesticide Factsheets

    This notice explaining the policy on label statement prohibiting pesticide application to water pertains only to the labeling statement on pesticide products. It does not address the term wetlands as defined with respect to the Clean Water Act.

  9. 40 CFR 406.60 - Applicability; description of the parboiled rice processing subcategory.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... parboiled rice processing subcategory. 406.60 Section 406.60 Protection of Environment ENVIRONMENTAL... Rice Processing Subcategory § 406.60 Applicability; description of the parboiled rice processing... rice is cleaned, cooked and dried before being milled. ...

  10. Innovative Instrumentation and Analysis of the Temperature Measurement for High Temperature Gasification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seong W. Lee

    The project entitled ''Innovative Instrumentation and Analysis of the Temperature Measurement for High Temperature Gasification'' was successfully completed by the Principal Investigator, Dr. S. Lee, and his research team in the Center for Advanced Energy Systems and Environmental Control Technologies at Morgan State University. The major results and outcomes were presented in semi-annual progress reports and annual project review meetings/presentations. Specifically, the literature survey covering gasifier temperature measurement, ultrasonic cleaning applications, and the spray coating process, together with gasifier simulator (cold model) testing, was successfully conducted during the first year. The results show that four factors (blower voltage, ultrasonic application, injection time intervals, particle weight) significantly affect the temperature measurement. The gasifier simulator (hot model) design and fabrication, as well as systematic tests of the significant factors on temperature measurement in the hot model, were then completed in the second year. Advanced industrial analytic methods such as statistics-based experimental design, analysis of variance (ANOVA), and regression were applied in the hot model tests. The results show that operational parameters (i.e., air flow rate, water flow rate, fine dust particle amount, ammonia addition) had a significant impact on the temperature measurement inside the gasifier simulator. Experimental design and ANOVA are efficient ways to design and analyze the experiments. The results show that the air flow rate and fine dust particle amount are statistically significant to the temperature measurement. The regression model provided the functional relation between the temperature and these factors with substantial accuracy.
In the last year of the project period, the ultrasonic and subsonic cleaning methods and coating materials were tested and applied to thermocouple cleaning according to the proposed approach. Different frequencies, application times, and power levels of the ultrasonic/subsonic output were tested. The results show that the ultrasonic approach is one of the best methods for cleaning thermocouple tips during routine operation of the gasifier. In addition, a real-time data acquisition system was designed and applied in the experiments; this advanced instrumentation provided efficient and accurate data acquisition for the project. In summary, the project provided useful information on the ultrasonic cleaning method applied to thermocouple tip cleaning. Temperature measurement could be much improved, in both accuracy and duration, if the proposed approach were widely used in gasification facilities.

  11. The latest developments and outlook for hydrogen liquefaction technology

    NASA Astrophysics Data System (ADS)

    Ohlig, K.; Decker, L.

    2014-01-01

    Liquefied hydrogen is presently used mainly for space applications and the semiconductor industry. While clean energy applications, e.g., in the automotive sector, currently contribute only a small share of this demand, that demand may see a significant boost in the coming years, with the need for large-scale liquefaction plants far exceeding current plant sizes. Hydrogen liquefaction for small-scale plants with a maximum capacity of 3 tons per day (tpd) is accomplished with a Brayton refrigeration cycle using helium as the refrigerant. This technology is characterized by low investment costs but lower process efficiency and hence higher operating costs. For larger plants, a hydrogen Claude cycle is used, characterized by higher investment but lower operating costs. However, liquefaction plants meeting the potentially high demand in the clean energy sector will need further optimization with regard to energy efficiency and hence operating costs. The present paper gives an overview of the currently applied technologies, including their thermodynamic and technical background. Areas of improvement are identified to derive process concepts for future large-scale hydrogen liquefaction plants meeting the needs of clean energy applications with optimized energy efficiency and hence minimized operating costs. Compared to studies in this field, this paper focuses on the application of new technology and innovative concepts which are either readily available or will require only short qualification procedures, and which will hence allow implementation in plants in the near future.

  12. Natural Inspired Intelligent Visual Computing and Its Application to Viticulture.

    PubMed

    Ang, Li Minn; Seng, Kah Phooi; Ge, Feng Lu

    2017-05-23

    This paper presents an investigation of natural inspired intelligent computing and its corresponding application towards visual information processing systems for viticulture. The paper has three contributions: (1) a review of visual information processing applications for viticulture; (2) the development of natural inspired computing algorithms based on artificial immune system (AIS) techniques for grape berry detection; and (3) the application of the developed algorithms towards real-world grape berry images captured in natural conditions from vineyards in Australia. The AIS algorithms in (2) were developed based on a nature-inspired clonal selection algorithm (CSA) which is able to detect the arcs in the berry images with precision, based on a fitness model. The arcs detected are then extended to perform the multiple arcs and ring detectors information processing for the berry detection application. The performance of the developed algorithms was compared with traditional image processing algorithms like the circular Hough transform (CHT) and other well-known circle detection methods. The proposed AIS approach gave an F-score of 0.71 compared with F-scores of 0.28 and 0.30 for the CHT and a parameter-free circle detection technique (RPCD) respectively.
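    The clonal selection algorithm underlying the berry detector can be illustrated on a toy optimization task. A minimal CSA sketch in Python; the population size, clone counts, and mutation radii are illustrative, not the paper's settings:

```python
import random

def clonal_selection(fitness, lo, hi, pop_size=20, clones=5,
                     generations=60, seed=1):
    """Minimal clonal selection algorithm (CSA) sketch: clone the
    fittest candidates, hypermutate the clones with a radius that
    grows for lower-ranked parents, and keep the best survivors."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        best = pop[:pop_size // 2]              # select high-affinity cells
        offspring = []
        for rank, cell in enumerate(best):
            radius = (hi - lo) * 0.1 * (rank + 1) / len(best)
            for _ in range(clones):             # hypermutated clones
                mutant = cell + rng.gauss(0, radius)
                offspring.append(min(hi, max(lo, mutant)))
        pop = sorted(best + offspring, key=fitness, reverse=True)[:pop_size]
    return pop[0]

# maximise a smooth unimodal fitness whose optimum is at x = 2
x = clonal_selection(lambda v: -(v - 2.0) ** 2, -10.0, 10.0)
print(round(x, 3))
```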

  13. Development of MODIS data-based algorithm for retrieving sea surface temperature in coastal waters.

    PubMed

    Wang, Jiao; Deng, Zhiqiang

    2017-06-01

    A new algorithm was developed for retrieving sea surface temperature (SST) in coastal waters using satellite remote sensing data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard the Aqua platform. The new SST algorithm was trained using the Artificial Neural Network (ANN) method and tested using 8 years of remote sensing data from the MODIS Aqua sensor and in situ sensing data from the US coastal waters in Louisiana, Texas, Florida, California, and New Jersey. The ANN algorithm could be utilized to map SST in both deep offshore and particularly shallow nearshore waters at the high spatial resolution of 1 km, greatly expanding the coverage of remote sensing-based SST data from offshore waters to nearshore waters. Applications of the ANN algorithm require only the remotely sensed reflectance values from the two MODIS Aqua thermal bands 31 and 32 as input data. Application results indicated that the ANN algorithm was able to explain 82-90% of the variation in observed SST in US coastal waters. While the algorithm is generally applicable to the retrieval of SST, it works best for nearshore waters where important coastal resources are located and existing algorithms are either not applicable or do not work well, making the new ANN-based SST algorithm unique and particularly useful to coastal resource management.
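    The essence of the approach, a small neural network mapping the two thermal-band values to SST, can be sketched as follows. The training data here are generated from an invented split-window-style formula, and the network size and learning rate are illustrative, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic brightness temperatures (K) for MODIS bands 31 and 32.
# The generating relation below is an invented split-window-style
# formula used only to exercise the network; it is not the paper's model.
t31 = rng.uniform(280, 300, 500)
t32 = t31 - rng.uniform(0.0, 2.0, 500)
sst = 1.02 * t31 + 1.5 * (t31 - t32) - 8.0

X = np.column_stack([t31, t32])
X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardise inputs
y = (sst - sst.mean()) / sst.std()              # standardise target

# one hidden tanh layer trained by plain batch gradient descent
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)
    pred = (h @ W2 + b2).ravel()
    err = pred - y
    dh = (err[:, None] @ W2.T) * (1 - h ** 2)   # backprop through tanh
    W2 -= 0.1 * (h.T @ err[:, None]) / len(y)
    b2 -= 0.1 * np.array([err.mean()])
    W1 -= 0.1 * (X.T @ dh) / len(y)
    b1 -= 0.1 * dh.mean(axis=0)

r2 = 1 - np.mean((pred - y) ** 2) / np.mean(y ** 2)
print(round(r2, 3))
```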

  14. Solving a combinatorial problem via self-organizing process: an application of the Kohonen algorithm to the traveling salesman problem.

    PubMed

    Fort, J C

    1988-01-01

    We present an application of the Kohonen algorithm to the traveling salesman problem: using only this algorithm, without an energy function or any parameters chosen ad hoc, we found good suboptimal tours. We give a neural model version of this algorithm, closer to classical neural networks. This is illustrated with various numerical examples.
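    A minimal self-organizing-map tour construction of the kind described can be sketched as follows; the node count, learning-rate schedule, and neighborhood schedule are illustrative, not the paper's:

```python
import math, random

def som_tsp(cities, n_nodes=None, iters=4000, seed=0):
    """Kohonen/SOM sketch for the TSP: a ring of nodes is repeatedly
    pulled toward randomly chosen cities; the winning node and its ring
    neighbours move, with the learning rate and neighbourhood radius
    shrinking over time. The tour is read off the final ring order."""
    rng = random.Random(seed)
    n = n_nodes or 3 * len(cities)
    nodes = [[rng.random(), rng.random()] for _ in range(n)]
    for t in range(iters):
        frac = t / iters
        lr = 0.8 * (1 - frac) + 0.01
        radius = max(1.0, n / 6 * (1 - frac))
        cx, cy = cities[rng.randrange(len(cities))]
        w = min(range(n),
                key=lambda i: (nodes[i][0] - cx) ** 2 + (nodes[i][1] - cy) ** 2)
        for i in range(n):
            d = min(abs(i - w), n - abs(i - w))   # distance along the ring
            g = math.exp(-(d * d) / (2 * radius * radius))
            nodes[i][0] += lr * g * (cx - nodes[i][0])
            nodes[i][1] += lr * g * (cy - nodes[i][1])
    # order cities by the index of their winning node on the ring
    return sorted(range(len(cities)),
                  key=lambda c: min(range(n), key=lambda i:
                      (nodes[i][0] - cities[c][0]) ** 2 +
                      (nodes[i][1] - cities[c][1]) ** 2))

def tour_length(cities, order):
    return sum(math.dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# cities on a circle: the optimal tour visits them in angular order
cities = [(0.5 + 0.4 * math.cos(a), 0.5 + 0.4 * math.sin(a))
          for a in [2 * math.pi * k / 8 for k in range(8)]]
order = som_tsp(cities)
length = tour_length(cities, order)
optimal = tour_length(cities, list(range(8)))
print(round(length / optimal, 2))
```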

  15. Rice- and butterfly-wing effect inspired self-cleaning and low drag micro/nanopatterned surfaces in water, oil, and air flow.

    PubMed

    Bixler, Gregory D; Bhushan, Bharat

    2014-01-07

    In search of new solutions to complex challenges, researchers are turning to living nature for inspiration. For example, special surface characteristics of rice leaves and butterfly wings combine the shark skin (anisotropic flow leading to low drag) and lotus leaf (superhydrophobic and self-cleaning) effects, producing the so-called rice and butterfly wing effect. In this paper, we study four microstructured surfaces inspired by rice leaves and fabricated with photolithography techniques. We also present a method of creating such surfaces using a hot embossing procedure for scaled-up manufacturing. Fluid drag, self-cleaning, contact angle, and contact angle hysteresis data are presented to understand the role of sample geometrical dimensions. Conceptual modeling provides design guidance when developing novel low drag, self-cleaning, and potentially antifouling surfaces for medical, marine, and industrial applications.

  16. A self-cleaning underwater superoleophobic mesh for oil-water separation.

    PubMed

    Zhang, Lianbin; Zhong, Yujiang; Cha, Dongkyu; Wang, Peng

    2013-01-01

    Oil-water separation has recently become a globally challenging task because of the frequent occurrence of oil spill accidents due to offshore oil production and transportation, and there is an increasing demand for the development of effective and inexpensive approaches for cleaning up oily pollution in water systems. In this study, a self-cleaning underwater superoleophobic mesh that can be used for oil-water separation is prepared by the layer-by-layer (LbL) assembly of sodium silicate and TiO2 nanoparticles on a stainless steel mesh. The integration of the self-cleaning property into the all-inorganic separation mesh by using TiO2 enables the convenient removal of contaminants by ultraviolet (UV) illumination, and allows for the facile recovery of the separation ability of the contaminated mesh, making it promising for practical oil-water separation applications.

  17. Dry cleaning of Turkish coal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cicek, T.

    2008-07-01

    This study dealt with the upgrading of two different types of Turkish coal by a dry cleaning method using a modified air table. The industrial-size air table used in this study is a device for removing stones from agricultural products. This study investigates the technical and economical feasibility of the dry cleaning method, which has never before been applied to coals in Turkey. The application of a dry cleaning method to Turkish coals designated for power generation, without generating environmental pollution and while ensuring a stable coal quality, is the main objective of this study. The size fractions of 5-8, 3-5, and 1-3 mm of the investigated coals were used in the upgrading experiments. Satisfactory results were achieved with coal from the Soma region, whereas the upgrading results for Hüsamlar coal were objectionable for the coarser size fractions. However, acceptable results were obtained for the 1-3 mm size fraction of Hüsamlar coal.

  18. A self-cleaning underwater superoleophobic mesh for oil-water separation

    PubMed Central

    Zhang, Lianbin; Zhong, Yujiang; Cha, Dongkyu; Wang, Peng

    2013-01-01

    Oil–water separation has recently become a globally challenging task because of the frequent occurrence of oil spill accidents due to offshore oil production and transportation, and there is an increasing demand for the development of effective and inexpensive approaches for cleaning up oily pollution in water systems. In this study, a self-cleaning underwater superoleophobic mesh that can be used for oil-water separation is prepared by the layer-by-layer (LbL) assembly of sodium silicate and TiO2 nanoparticles on a stainless steel mesh. The integration of the self-cleaning property into the all-inorganic separation mesh by using TiO2 enables the convenient removal of contaminants by ultraviolet (UV) illumination, and allows for the facile recovery of the separation ability of the contaminated mesh, making it promising for practical oil-water separation applications. PMID:23900109

  19. Behavior-Based Cleaning for Unreliable RFID Data Sets

    PubMed Central

    Fan, Hua; Wu, Quanyuan; Lin, Yisong

    2012-01-01

    Radio Frequency IDentification (RFID) technology promises to revolutionize the way we track items and assets, but in RFID systems misreading is a common phenomenon that poses an enormous challenge to RFID data management, so accurate data cleaning becomes an essential task for successful deployment. In this paper, we present the design and development of an RFID data cleaning system, the first declarative, behavior-based smoothing system for unreliable RFID data. We take advantage of the kinematic characteristics of tags to assist in RFID data cleaning. In order to establish the conversion relationship between RFID data and the kinematic parameters of the tags, we propose a movement behavior detection model. Moreover, a Reverse Order Filling Mechanism is proposed to ensure more complete access to the movement behavior characteristics of tags. Finally, we validate our solution with a common RFID application and demonstrate the advantages of our approach through extensive simulations. PMID:23112595

  20. Behavior-based cleaning for unreliable RFID data sets.

    PubMed

    Fan, Hua; Wu, Quanyuan; Lin, Yisong

    2012-01-01

    Radio Frequency IDentification (RFID) technology promises to revolutionize the way we track items and assets, but in RFID systems misreading is a common phenomenon that poses an enormous challenge to RFID data management, so accurate data cleaning becomes an essential task for successful deployment. In this paper, we present the design and development of an RFID data cleaning system, the first declarative, behavior-based smoothing system for unreliable RFID data. We take advantage of the kinematic characteristics of tags to assist in RFID data cleaning. In order to establish the conversion relationship between RFID data and the kinematic parameters of the tags, we propose a movement behavior detection model. Moreover, a Reverse Order Filling Mechanism is proposed to ensure more complete access to the movement behavior characteristics of tags. Finally, we validate our solution with a common RFID application and demonstrate the advantages of our approach through extensive simulations.
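    For contrast with the paper's behavior-based model, the baseline smoothing it improves on can be sketched as a fixed-window fill: a tag missed for a few read epochs between two sightings is assumed present. The function name and window size below are illustrative:

```python
def window_smooth(reads, window=2):
    """Fixed-window RFID smoothing sketch: a tag missed for up to
    `window` consecutive epochs between two sightings is assumed
    present, and the gap is filled in. This is the simple baseline
    the paper improves on, not its behavior-based model."""
    epochs = sorted(reads)
    filled = set(epochs)
    for a, b in zip(epochs, epochs[1:]):
        if 1 < b - a <= window + 1:       # short gap: assume missed reads
            filled.update(range(a + 1, b))
    return sorted(filled)

# a tag present during epochs 1..6 whose reads at 3 and 4 were missed
print(window_smooth([1, 2, 5, 6], window=2))  # → [1, 2, 3, 4, 5, 6]
```

    Longer gaps are left untouched, since they more plausibly reflect a tag that actually left the reader's range.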

  1. Self-cleaning and self-sanitizing coatings on plastic fabrics: design, manufacture and performance.

    PubMed

    Barletta, M; Vesco, S; Tagliaferri, V

    2014-08-01

    Self-cleaning and self-sanitizing coatings are of utmost interest in several manufacturing domains. In particular, fabrics and textile materials are often pre-treated by impregnation or incorporation with antimicrobial pesticides for protection against bacteria and fungi that are pathogenic to man or other animals. In this respect, the present investigation deals with the design and manufacture of self-cleaning and self-sanitizing coatings on plastic fabrics. The functionalization of the coatings was achieved by incorporating active inorganic matter (i.e., photo-catalytic TiO2 anatase and Ag(+) ions) inside an organic-inorganic hybrid binder. The achieved formulations were deposited on coextruded polyvinylchloride-polyester fabrics by air-mix spraying and left to dry at ambient temperature. The performance of the resulting coatings was characterized for self-cleaning and self-sanitizing ability according to standardized testing procedures and/or applicable international regulations. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Rice- and butterfly-wing effect inspired self-cleaning and low drag micro/nanopatterned surfaces in water, oil, and air flow

    NASA Astrophysics Data System (ADS)

    Bixler, Gregory D.; Bhushan, Bharat

    2013-12-01

    In search of new solutions to complex challenges, researchers are turning to living nature for inspiration. For example, special surface characteristics of rice leaves and butterfly wings combine the shark skin (anisotropic flow leading to low drag) and lotus leaf (superhydrophobic and self-cleaning) effects, producing the so-called rice and butterfly wing effect. In this paper, we study four microstructured surfaces inspired by rice leaves and fabricated with photolithography techniques. We also present a method of creating such surfaces using a hot embossing procedure for scaled-up manufacturing. Fluid drag, self-cleaning, contact angle, and contact angle hysteresis data are presented to understand the role of sample geometrical dimensions. Conceptual modeling provides design guidance when developing novel low drag, self-cleaning, and potentially antifouling surfaces for medical, marine, and industrial applications.

  3. Development and application of unified algorithms for problems in computational science

    NASA Technical Reports Server (NTRS)

    Shankar, Vijaya; Chakravarthy, Sukumar

    1987-01-01

    A framework is presented for developing computationally unified numerical algorithms for solving nonlinear equations that arise in modeling various problems in mathematical physics. The concept of computational unification is an attempt to encompass efficient solution procedures for computing various nonlinear phenomena that may occur in a given problem. For example, in Computational Fluid Dynamics (CFD), a unified algorithm will be one that allows for solutions to subsonic (elliptic), transonic (mixed elliptic-hyperbolic), and supersonic (hyperbolic) flows for both steady and unsteady problems. The objectives are: development of superior unified algorithms emphasizing accuracy and efficiency aspects; development of codes based on selected algorithms leading to validation; application of mature codes to realistic problems; and extension/application of CFD-based algorithms to problems in other areas of mathematical physics. The ultimate objective is to achieve integration of multidisciplinary technologies to enhance synergism in the design process through computational simulation. Specific unified algorithms are presented for a hierarchy of gas dynamics equations, along with their applications to two other areas: electromagnetic scattering and laser-materials interaction accounting for melting.

  4. The Applications of Genetic Algorithms in Medicine.

    PubMed

    Ghaheri, Ali; Shoar, Saeed; Naderan, Mohammad; Hoseini, Sayed Shahabuddin

    2015-11-01

    A great wealth of information is hidden amid medical research data that in some cases cannot be easily analyzed, if at all, using classical statistical methods. Inspired by nature, metaheuristic algorithms have been developed to offer optimal or near-optimal solutions to complex data analysis and decision-making tasks in a reasonable time. Due to their powerful features, metaheuristic algorithms have frequently been used in other fields of science. In medicine, however, these algorithms are not well known to physicians, who may well benefit by applying them to solve complex medical problems. Therefore, in this paper, we introduce the genetic algorithm and its applications in medicine. The use of the genetic algorithm has promising implications in various medical specialties including radiology, radiotherapy, oncology, pediatrics, cardiology, endocrinology, surgery, obstetrics and gynecology, pulmonology, infectious diseases, orthopedics, rehabilitation medicine, neurology, pharmacotherapy, and health care management. This review introduces the applications of the genetic algorithm in disease screening, diagnosis, treatment planning, pharmacovigilance, prognosis, and health care management, and enables physicians to envision possible applications of this metaheuristic method in their medical career.
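    A genetic algorithm of the kind the review introduces can be sketched in a few lines; the OneMax fitness, operators, and parameters below are textbook illustrations, not tied to any medical application in the review:

```python
import random

def genetic_algorithm(fitness, n_bits=16, pop_size=30, generations=80,
                      p_mut=0.02, seed=3):
    """Minimal genetic algorithm sketch: tournament selection,
    one-point crossover, and per-bit flip mutation. All parameters
    are illustrative."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                            # tournament of 3
            return max(rng.sample(pop, 3), key=fitness)
        nxt = []
        while len(nxt) < pop_size:
            a, b = pick(), pick()
            cut = rng.randrange(1, n_bits)     # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# OneMax: fitness is the number of 1-bits; the optimum is all ones
best = genetic_algorithm(sum)
print(sum(best), "of 16 bits set")
```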

  5. The Applications of Genetic Algorithms in Medicine

    PubMed Central

    Ghaheri, Ali; Shoar, Saeed; Naderan, Mohammad; Hoseini, Sayed Shahabuddin

    2015-01-01

    A great wealth of information is hidden amid medical research data that in some cases cannot be easily analyzed, if at all, using classical statistical methods. Inspired by nature, metaheuristic algorithms have been developed to offer optimal or near-optimal solutions to complex data analysis and decision-making tasks in a reasonable time. Due to their powerful features, metaheuristic algorithms have frequently been used in other fields of science. In medicine, however, these algorithms are not well known to physicians, who may well benefit by applying them to solve complex medical problems. Therefore, in this paper, we introduce the genetic algorithm and its applications in medicine. The use of the genetic algorithm has promising implications in various medical specialties including radiology, radiotherapy, oncology, pediatrics, cardiology, endocrinology, surgery, obstetrics and gynecology, pulmonology, infectious diseases, orthopedics, rehabilitation medicine, neurology, pharmacotherapy, and health care management. This review introduces the applications of the genetic algorithm in disease screening, diagnosis, treatment planning, pharmacovigilance, prognosis, and health care management, and enables physicians to envision possible applications of this metaheuristic method in their medical career. PMID:26676060

  6. Exploiting Redundancy and Application Scalability for Cost-Effective, Time-Constrained Execution of HPC Applications on Amazon EC2

    DOE PAGES

    Marathe, Aniruddha P.; Harris, Rachel A.; Lowenthal, David K.; ...

    2015-12-17

    The use of clouds to execute high-performance computing (HPC) applications has greatly increased recently. Clouds provide several potential advantages over traditional supercomputers and in-house clusters. The most popular cloud is currently Amazon EC2, which provides fixed-cost and variable-cost, auction-based options. The auction market trades lower cost for potential interruptions that necessitate checkpointing; if the market price exceeds the bid price, a node is taken away from the user without warning. We explore techniques to maximize performance per dollar given a time constraint within which an application must complete. Specifically, we design and implement multiple techniques to reduce expected cost by exploiting redundancy in the EC2 auction market. We then design an adaptive algorithm that selects a scheduling algorithm and determines the bid price. We show that our adaptive algorithm executes programs up to seven times cheaper than using the on-demand market and up to 44 percent cheaper than the best non-redundant, auction-market algorithm. We extend our adaptive algorithm to incorporate application scalability characteristics for further cost savings. In conclusion, we show that the adaptive algorithm informed with scalability characteristics of applications achieves up to 56 percent cost savings compared to the expected cost for the base adaptive algorithm run at a fixed, user-defined scale.
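    The core trade-off the paper optimizes, expected auction-market cost versus fixed on-demand cost, can be illustrated with a deliberately simplified model; the interruption and rework accounting below is a sketch under stated assumptions, not the paper's adaptive algorithm:

```python
def expected_spot_cost(spot_price, hours, p_interrupt, redo_hours):
    """Expected cost of `hours` of work in an auction market where
    each hour of progress is interrupted with probability
    `p_interrupt`, losing `redo_hours` of work per interruption.
    A deliberately simplified cost model."""
    expected_interruptions = hours * p_interrupt
    billable_hours = hours + expected_interruptions * redo_hours
    return spot_price * billable_hours

def choose_market(spot_price, on_demand_price, hours, p_interrupt, redo_hours):
    """Pick the cheaper market under the model above."""
    spot = expected_spot_cost(spot_price, hours, p_interrupt, redo_hours)
    demand = on_demand_price * hours
    return ("spot", spot) if spot < demand else ("on-demand", demand)

# a cheap but flaky spot node versus a reliable on-demand node
market, cost = choose_market(spot_price=0.3, on_demand_price=1.0,
                             hours=100, p_interrupt=0.05, redo_hours=2)
print(market, round(cost, 2))
```

    Even with 5 interruptions expected over the 100-hour job, the rework penalty is small enough here that the auction market wins; raising `p_interrupt` or `redo_hours` eventually flips the decision.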

  7. Air powder abrasive treatment as an implant surface cleaning method: a literature review.

    PubMed

    Tastepe, Ceylin S; van Waas, Rien; Liu, Yuelian; Wismeijer, Daniel

    2012-01-01

    To evaluate the air powder abrasive treatment as an implant surface cleaning method for peri-implantitis based on the existing literature. A PubMed search was conducted to find articles that reported on air powder abrasive treatment as an implant surface cleaning method for peri-implantitis. The studies evaluated cleaning efficiency and surface change as a result of the method. Furthermore, cell response toward the air powder abrasive-treated discs, reosseointegration, and clinical outcome after treatment is also reported. The PubMed search resulted in 27 articles meeting the inclusion criteria. In vitro cleaning efficiency of the method is reported to be high. The method resulted in minor surface changes on titanium specimens. Although the air powder abrasive-treated specimens showed sufficient levels of cell attachment and cell viability, the cell response decreased compared with sterile discs. Considerable reosseointegration between 39% and 46% and improved clinical parameters were reported after treatment when applied in combination with surgical treatment. The results of the treatment are influenced by the powder type used, the application time, and whether powder was applied surgically or nonsurgically. The in vivo data on air powder abrasive treatment as an implant surface cleaning method is not sufficient to draw definitive conclusions. However, in vitro results allow the clinician to consider the method as a promising option for implant surface cleaning in peri-implantitis treatment.

  8. ComprehensiveBench: a Benchmark for the Extensive Evaluation of Global Scheduling Algorithms

    NASA Astrophysics Data System (ADS)

    Pilla, Laércio L.; Bozzetti, Tiago C.; Castro, Márcio; Navaux, Philippe O. A.; Méhaut, Jean-François

    2015-10-01

    Parallel applications that present tasks with imbalanced loads or complex communication behavior usually do not exploit the underlying resources of parallel platforms to their full potential. In order to mitigate this issue, global scheduling algorithms are employed. As finding the optimal task distribution is an NP-Hard problem, identifying the most suitable algorithm for a specific scenario and comparing algorithms are not trivial tasks. In this context, this paper presents ComprehensiveBench, a benchmark for global scheduling algorithms that enables the variation of a vast range of parameters that affect performance. ComprehensiveBench can be used to assist in the development and evaluation of new scheduling algorithms, to help choose a specific algorithm for an arbitrary application, to emulate other applications, and to enable statistical tests. We illustrate its use in this paper with an evaluation of Charm++ periodic load balancers that stresses their characteristics.
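    As a concrete example of a global scheduling heuristic of the kind such a benchmark would evaluate, here is a longest-processing-time (LPT) greedy sketch; it is a classic baseline, not one of the Charm++ balancers discussed:

```python
import heapq

def lpt_schedule(task_loads, n_procs):
    """Longest-processing-time (LPT) greedy scheduling: assign each
    task, heaviest first, to the currently least-loaded processor.
    Returns the task-to-processor assignment and the makespan."""
    heap = [(0.0, p) for p in range(n_procs)]   # (load, processor)
    heapq.heapify(heap)
    assignment = {}
    for task, load in sorted(enumerate(task_loads),
                             key=lambda kv: kv[1], reverse=True):
        total, proc = heapq.heappop(heap)       # least-loaded processor
        assignment[task] = proc
        heapq.heappush(heap, (total + load, proc))
    makespan = max(total for total, _ in heap)
    return assignment, makespan

loads = [7, 5, 4, 3, 3, 2]                      # imbalanced task loads
assignment, makespan = lpt_schedule(loads, 2)
print(makespan)  # → 12.0 (perfect split: 7+5 versus 4+3+3+2)
```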

  9. Cleaning Products and Air Fresheners: Emissions and ResultingConcentrations of Glycol Ethers and Terpenoids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singer, Brett C.; Destaillat, Hugo; Hodgson, Alfred T.

    2005-08-01

    Experiments were conducted to quantify emissions and concentrations of glycol ethers and terpenoids from cleaning product and air freshener use in a 50-m³ room ventilated at ≈0.5 h⁻¹. Five cleaning products were applied full-strength (FS); three were additionally used in dilute solution. FS application of pine-oil cleaner (POC) yielded 1-h concentrations of 10-1300 µg m⁻³ for individual terpenoids, including α-terpinene (90-120), d-limonene (1000-1100), terpinolene (900-1300), and α-terpineol (260-700). One-hour concentrations of 2-butoxyethanol and/or d-limonene were 300-6000 µg m⁻³ after FS use of other products. During FS application including rinsing with sponge and wiping with towels, fractional emissions (mass volatilized/dispensed) of 2-butoxyethanol and d-limonene were 50-100% with towels retained, ≈25-50% when towels were removed after cleaning. Lower fractions (2-11%) resulted from dilute use. Fractional emissions of terpenes from FS use of POC were ≈35-70% with towels retained, 20-50% with towels removed. During floor cleaning with dilute solution of POC, 7-12% of dispensed terpenes were emitted. Terpene alcohols were emitted at lower fractions: 7-30% (FS, towels retained), 2-9% (FS, towels removed), and 2-5% (dilute). During air-freshener use, d-limonene, dihydromyrcenol, linalool, linalyl acetate, and β-citronellol were emitted at 35-180 mg d⁻¹ over three days while air concentrations averaged 30-160 µg m⁻³.
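    The reported emission rates and 1-h concentrations are linked by a standard well-mixed single-zone model, which can be sketched as follows; the emitted mass below is an illustrative number, not a value from the study:

```python
import math

def concentration(mass_emitted_ug, volume_m3, ach, hours):
    """Well-mixed single-zone box model: a mass emitted at a constant
    rate over `hours` into a room of `volume_m3` ventilated at `ach`
    air changes per hour. Returns the concentration (ug/m3) at time
    `hours`. A textbook model, not the study's measurement method."""
    rate = mass_emitted_ug / hours              # ug emitted per hour
    c_steady = rate / (ach * volume_m3)         # steady-state concentration
    return c_steady * (1 - math.exp(-ach * hours))

# 50 m3 room at 0.5 air changes per hour, as in the chamber conditions;
# 100 mg of d-limonene volatilised over 1 h (illustrative number)
c = concentration(100_000, 50.0, 0.5, 1.0)
print(round(c), "ug/m3 after one hour")
```

    The result lands in the same 1000-ish µg m⁻³ range as the reported full-strength pine-oil cleaner concentrations, which is the sanity check such a box model provides.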

  10. Neural decoding of attentional selection in multi-speaker environments without access to clean sources

    NASA Astrophysics Data System (ADS)

    O'Sullivan, James; Chen, Zhuo; Herrero, Jose; McKhann, Guy M.; Sheth, Sameer A.; Mehta, Ashesh D.; Mesgarani, Nima

    2017-10-01

    Objective. People who suffer from hearing impairments can find it difficult to follow a conversation in a multi-speaker environment. Current hearing aids can suppress background noise; however, there is little that can be done to help a user attend to a single conversation amongst many without knowing which speaker the user is attending to. Cognitively controlled hearing aids that use auditory attention decoding (AAD) methods are the next step in offering help. Translating the successes in AAD research to real-world applications poses a number of challenges, including the lack of access to the clean sound sources in the environment with which to compare with the neural signals. We propose a novel framework that combines single-channel speech separation algorithms with AAD. Approach. We present an end-to-end system that (1) receives a single audio channel containing a mixture of speakers that is heard by a listener along with the listener’s neural signals, (2) automatically separates the individual speakers in the mixture, (3) determines the attended speaker, and (4) amplifies the attended speaker’s voice to assist the listener. Main results. Using invasive electrophysiology recordings, we identified the regions of the auditory cortex that contribute to AAD. Given appropriate electrode locations, our system is able to decode the attention of subjects and amplify the attended speaker using only the mixed audio. Our quality assessment of the modified audio demonstrates a significant improvement in both subjective and objective speech quality measures. Significance. Our novel framework for AAD bridges the gap between the most recent advancements in speech processing technologies and speech prosthesis research and moves us closer to the development of cognitively controlled hearable devices for the hearing impaired.
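    The attended-speaker selection step (comparing a neural reconstruction of the attended envelope against each separated speaker's envelope) can be sketched with correlation; here the stimulus-reconstruction stage is replaced by a noisy copy of the attended envelope, so everything below is a toy stand-in:

```python
import numpy as np

rng = np.random.default_rng(7)

def decode_attention(reconstructed, env_a, env_b):
    """Correlate a neural reconstruction of the attended speech
    envelope with each separated speaker's envelope; the higher
    correlation marks the attended speaker."""
    ca = np.corrcoef(reconstructed, env_a)[0, 1]
    cb = np.corrcoef(reconstructed, env_b)[0, 1]
    return "A" if ca > cb else "B"

def smooth(x):
    """Toy speech envelope: a boxcar-smoothed random signal."""
    return np.convolve(x, np.ones(20) / 20, mode="same")

env_a = smooth(rng.random(1000))
env_b = smooth(rng.random(1000))

# stand-in for the stimulus-reconstruction step: the listener attends
# to speaker A, so the "reconstruction" is a noisy copy of env_a
reconstructed = env_a + 0.05 * rng.standard_normal(1000)
attended = decode_attention(reconstructed, env_a, env_b)
print(attended)  # → A
```

    In the actual system the reconstruction comes from electrode signals and the candidate envelopes from the single-channel separation stage; only the comparison logic carries over from this sketch.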

  11. A machine learning approach to multi-level ECG signal quality classification.

    PubMed

    Li, Qiao; Rajagopalan, Cadathur; Clifford, Gari D

    2014-12-01

    Current electrocardiogram (ECG) signal quality assessment studies have aimed to provide a two-level classification: clean or noisy. However, clinical usage demands more specific noise level classification for varying applications. This work outlines a five-level ECG signal quality classification algorithm. A total of 13 signal quality metrics were derived from segments of ECG waveforms, which were labeled by experts. A support vector machine (SVM) was trained to perform the classification, tested on a simulated dataset, and validated using data from the MIT-BIH arrhythmia database (MITDB). The simulated training and test datasets were created by selecting clean segments of the ECG in the 2011 PhysioNet/Computing in Cardiology Challenge database, and adding three types of real ECG noise at different signal-to-noise ratio (SNR) levels from the MIT-BIH Noise Stress Test Database (NSTDB). The MITDB was re-annotated for five levels of signal quality. Different combinations of the 13 metrics were trained and tested on the simulated datasets, and the combination that produced the highest classification accuracy was selected and validated on the MITDB. Performance was assessed using classification accuracy (Ac) and a single-class-overlap accuracy (OAc), which counts a segment classified into an adjacent class as acceptable. An Ac of 80.26% and an OAc of 98.60% on the test set were obtained by selecting 10 metrics, while 57.26% (Ac) and 94.23% (OAc) were obtained on the unseen MITDB validation data without retraining. With fivefold cross-validation, an Ac of 88.07±0.32% and an OAc of 99.34±0.07% were obtained on the validation folds of the MITDB. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
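    The two reported metrics can be computed directly from label vectors; a pure-NumPy helper (the function name is illustrative):

```python
import numpy as np

def accuracies(true_labels, predicted_labels):
    """Classification accuracy (Ac) and single-class-overlap accuracy (OAc):
    for OAc, a prediction one quality level away from the truth still counts
    as acceptable."""
    t = np.asarray(true_labels)
    p = np.asarray(predicted_labels)
    ac = float(np.mean(t == p))
    oac = float(np.mean(np.abs(t - p) <= 1))
    return ac, oac
```

    OAc is always at least as large as Ac, which matches the gap between the paper's reported pairs (e.g. 80.26% Ac versus 98.60% OAc).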

  12. A Data Cleaning Method for Big Trace Data Using Movement Consistency

    PubMed Central

    Tang, Luliang; Zhang, Xia; Li, Qingquan

    2018-01-01

    Given the popularization of GPS technologies, the massive amounts of spatiotemporal GPS traces collected by vehicles are becoming a new kind of big data source for urban geographic information extraction. The growing volume of the dataset, however, creates processing and management difficulties, while the low quality generates uncertainties when investigating human activities. Based on the error distribution law and position accuracy of GPS data, we propose in this paper a data cleaning method for this kind of spatial big data using movement consistency. First, a trajectory is partitioned into a set of sub-trajectories using the movement characteristic points. In this process, GPS points at which the motion status of the vehicle transforms from one state into another are regarded as movement characteristic points. Then, GPS data are cleaned based on the similarities of GPS points and the movement consistency model of the sub-trajectory. The movement consistency model is built using the random sample consensus algorithm based on the high spatial consistency of high-quality GPS data. The proposed method is evaluated based on extensive experiments, using GPS trajectories generated by a sample of vehicles over a 7-day period in Wuhan city, China. The results show the effectiveness and efficiency of the proposed method. PMID:29522456
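    The random sample consensus step can be sketched for the simplest case, a straight sub-trajectory modeled as a 2-D line: points far from the best consensus model are flagged as low-quality fixes. This is a minimal sketch under that straight-line assumption, not the paper's full movement consistency model; the tolerance and iteration counts are illustrative.

```python
import numpy as np

def ransac_inliers(points, n_iter=200, tol=5.0, seed=0):
    """Flag GPS points consistent with a line fitted by random sample
    consensus; points far from the best model are treated as outliers."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        dx, dy = q - p
        norm = np.hypot(dx, dy)
        if norm == 0.0:
            continue
        # Perpendicular distance of every point to the candidate line p-q.
        dist = np.abs(dx * (points[:, 1] - p[1])
                      - dy * (points[:, 0] - p[0])) / norm
        inliers = dist < tol
        if inliers.sum() > best.sum():
            best = inliers
    return best
```

    Cleaning then amounts to keeping only the points the mask marks as inliers.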

  13. Dirt detection on brown eggs by means of color computer vision.

    PubMed

    Mertens, K; De Ketelaere, B; Kamers, B; Bamelis, F R; Kemps, B J; Verhoelst, E M; De Baerdemaeker, J G; Decuypere, E M

    2005-10-01

    In the last 20 yr, different methods for detecting defects in eggs were developed. Until now, no satisfying technique existed to sort and quantify dirt on eggshells. The work presented here focuses on the design of an off-line computer vision system to differentiate and quantify the presence of different dirt stains on brown eggs: dark (feces), white (uric acid), blood, and yolk stains. A system that provides uniform light exposure around the egg was designed. In this uniform light, pictures of dirty and clean eggs were taken, stored, and analyzed. The classification was based on a few standard logical operators, allowing for a quick implementation in an on-line set-up. In an experiment, 100 clean and 100 dirty eggs were used to validate the classification algorithm. The designed vision system showed an accuracy of 99% for the detection of dirt stains. Two percent of the clean eggs had a light-colored eggshell and were subsequently mistaken for showing large white stains. The accuracy of differentiation of the different kinds of dirt stains was 91%. Of the eggs with dark stains, 10.81% were mistaken for having bloodstains, and 33.33% of eggs with bloodstains were mistaken for having dark stains. The developed system is possibly a first step toward an on-line dirt evaluation technique for brown eggs.

  14. A street rubbish detection algorithm based on Sift and RCNN

    NASA Astrophysics Data System (ADS)

    Yu, XiPeng; Chen, Zhong; Zhang, Shuo; Zhang, Ting

    2018-02-01

    This paper presents a street rubbish detection algorithm based on image registration with SIFT features and an RCNN. Firstly, a CNN convolutional neural network is trained on a sample set consisting of rubbish and non-rubbish images. Secondly, for every clean street image, SIFT features are extracted and image registration is performed against the real-time street image to obtain a differential image; the differential image filters out most of the background information, and the region proposals where rubbish may appear are obtained on it with the selective search algorithm. Then, the CNN model is used to classify the image pixel data in each region proposal on the real-time street image. According to the output vector of the CNN, it is judged whether rubbish is present in the region proposal; if so, the region proposal is marked on the real-time street image. This algorithm avoids the large number of false detections caused by running detection on the whole image, because the CNN only examines the region proposals where rubbish may appear. Unlike traditional region-proposal-based object detection algorithms, the region proposals are obtained on the differential image rather than on the whole real-time street image, greatly reducing the number of invalid region proposals. The algorithm achieves a high mean average precision (mAP).
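    After registration, the differencing step that suppresses the static background reduces to a thresholded absolute difference; a minimal sketch on already-registered grayscale images (the function name and threshold value are assumptions, and SIFT registration itself is omitted):

```python
import numpy as np

def change_mask(clean_img, live_img, threshold=30):
    """Absolute difference between a registered clean street image and the
    live image; pixels above the threshold mark regions where rubbish may
    have appeared."""
    diff = np.abs(live_img.astype(np.int16) - clean_img.astype(np.int16))
    return diff > threshold
```

    Connected regions of the mask would then be turned into region proposals for the CNN to classify.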

  15. 40 CFR Table 1 to Subpart B of... - Section 112(j) Part 2 Application Due Dates

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CATEGORIES Requirements for Control Technology Determinations for Major Sources in Accordance With Clean Air...—Section 112(j) Part 2 Application Due Dates Due date MACT standard 10/30/03 Combustion Turbines.Lime...

  16. 40 CFR Table 1 to Subpart B of... - Section 112(j) Part 2 Application Due Dates

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CATEGORIES Requirements for Control Technology Determinations for Major Sources in Accordance With Clean Air...—Section 112(j) Part 2 Application Due Dates Due date MACT standard 10/30/03 Combustion Turbines.Lime...

  17. 40 CFR Table 1 to Subpart B of... - Section 112(j) Part 2 Application Due Dates

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... CATEGORIES Requirements for Control Technology Determinations for Major Sources in Accordance With Clean Air...—Section 112(j) Part 2 Application Due Dates Due date MACT standard 10/30/03 Combustion Turbines.Lime...

  18. 40 CFR Table 1 to Subpart B of... - Section 112(j) Part 2 Application Due Dates

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CATEGORIES Requirements for Control Technology Determinations for Major Sources in Accordance With Clean Air...—Section 112(j) Part 2 Application Due Dates Due date MACT standard 10/30/03 Combustion Turbines.Lime...

  19. Safety risks of hydrogen fuel for applications in transportation vehicles.

    DOT National Transportation Integrated Search

    2009-04-01

    Combustion of hydrocarbon fuels in many practical applications produces pollutants that are harmful to human health and environment. Hydrogen fuel is considered to be a potential answer to the clean energy demands, especially with the advances in fue...

  20. Gas-phase optical fiber photocatalytic reactors for indoor air application: a preliminary study on performance indicators

    NASA Astrophysics Data System (ADS)

    Palmiste, Ü.; Voll, H.

    2017-10-01

    The development of advanced air cleaning technologies aims to reduce building energy consumption by reducing outdoor air flow rates while keeping the indoor air quality at an acceptable level by air cleaning. Photocatalytic oxidation (PCO) is an emerging technology for gas-phase air cleaning that can be applied in a standalone unit or a subsystem of a building mechanical ventilation system. Quantitative information on photocatalytic reactor performance is required to evaluate the technical and economic viability of advanced air cleaning by PCO technology as an energy conservation measure in a building air conditioning system. Photocatalytic reactors applying optical fibers as light guide or photocatalyst coating support have been reported as an approach to address the current light utilization problems and thus improve the overall efficiency. The aim of the paper is to present a preliminary evaluation of continuous-flow optical fiber photocatalytic reactors based on performance indicators commonly applied for air cleaners. Based on experimental data, the performance of monolith-type optical fiber reactors surpasses that of annular-type optical fiber reactors in single-pass removal efficiency, clean air delivery rate, and operating cost efficiency.
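    Two of the performance indicators named above are directly related: the clean air delivery rate of a single-pass air cleaner is the airflow through the device scaled by its single-pass removal efficiency. A one-line helper, with units assumed to be cubic meters per hour:

```python
def clean_air_delivery_rate(airflow_m3_per_h, single_pass_efficiency):
    """Clean air delivery rate (CADR) of a single-pass air cleaner: the
    airflow through the device times the fraction of the contaminant
    removed in one pass (0..1)."""
    return airflow_m3_per_h * single_pass_efficiency
```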

  1. Wash or wipe? A comparative study of skin physiological changes between water washing and wiping after skin cleaning.

    PubMed

    Ogai, K; Matsumoto, M; Aoki, M; Ota, R; Hashimoto, K; Wada, R; Kobayashi, M; Sugama, J

    2017-11-01

    Presently, skin-cleaning agents that claim to be removed by water or wiping alone are commercially available and have been used for the purpose of bed baths. However, there is a lack of knowledge on how water washing and wiping differently affect skin physiological functions or ceramide content. The aim of this study was to compare the effects of water washing and wiping on skin physiological functions and ceramide content. Three kinds of the cleaning agents with different removal techniques (ie, water washing and wiping) were used in this study. Skin physiological functions (ie, transepidermal water loss, skin hydration, and skin pH) and skin ceramide content were measured before and after seven consecutive days of the application of each cleaning agent. No significant differences in skin physiological functions or ceramide content were observed between water washing and wiping. Cleaning agents that claim to be removed by water washing or wiping do not affect skin physiological functions or ceramide content by either removal method. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. Toward clean suspended CVD graphene

    DOE PAGES

    Yulaev, Alexander; Univ. of Maryland, College Park, MD; Cheng, Guangjun; ...

    2016-08-26

    The application of suspended graphene as electron transparent supporting media in electron microscopy, vacuum electronics, and micromechanical devices requires the least destructive and maximally clean transfer from their original growth substrate to the target of interest. Here, we use thermally evaporated anthracene films as the sacrificial layer for graphene transfer onto an arbitrary substrate. We show that clean suspended graphene can be achieved via desorbing the anthracene layer at temperatures in the 100 °C to 150 °C range, followed by two sequential annealing steps for the final cleaning, using a Pt catalyst and activated carbon. The cleanliness of the suspended graphene membranes was analyzed employing the high surface sensitivity of low energy scanning electron microscopy and X-ray photoelectron spectroscopy. A quantitative comparison with two other commonly used transfer methods revealed the superiority of the anthracene approach to obtain a larger area of clean, suspended CVD graphene. Lastly, our graphene transfer method based on anthracene paves the way for integrating cleaner graphene in various types of complex devices, including the ones that are heat and humidity sensitive.

  3. [Application of nanoscale material in environmental remediation and its eco-environmental toxicity assessment: a review].

    PubMed

    Wang, Meng; Chen, Shi-Bao; Ma, Yi-Bing

    2010-11-01

    Though it has been claimed that nanotechnology has great potential in environmental cleaning, caution is required in the application of nano-particles (<100 nm). Studies of organism exposure have shown that nano-particles can be hazardous. Currently, many papers are available about the remediation efficiency, characteristics, and mechanisms of manufactured nanoparticles applied to polluted environments, but few studies are conducted on the ecotoxicological effects of the nano-particles. This paper reviewed the current research on the hazards of nano- or ultrafine particles in environmental detoxification, discussed the potential environmental risks of applying nano-particles, and outlined perspectives for nanoparticles in environmental cleaning research.

  4. Gulf Coast Clean Energy Application Center

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dillingham, Gavin

    The Gulf Coast Clean Energy Application Center was initiated to significantly improve market and regulatory conditions for the implementation of combined heat and power technologies. The GC CEAC was responsible for the development of CHP in Texas, Louisiana and Oklahoma. Through this program we employed a variety of outreach and education techniques, developed and deployed assessment tools and conducted market assessments. These efforts resulted in the growth of the combined heat and power market in the Gulf Coast region with a realization of more efficient energy generation, reduced emissions and a more resilient infrastructure. With respect to research, we did not formally investigate any techniques under a formal research design or methodology.

  5. Soldering and brazing safety guide: A handbook on space practice for those involved in soldering and brazing

    NASA Astrophysics Data System (ADS)

    This manual provides those involved in soldering and brazing with effective safety procedures for use in the performance of their jobs. Hazards exist in four general soldering and brazing process steps: (1) cleaning; (2) application of flux; (3) application of heat and filler metal; and (4) residue cleaning. Most hazards during these operations can be avoided by using care, proper ventilation, and protective clothing and equipment. Specific process hazards for various methods of brazing and soldering are treated. Methods to check ventilation are presented, and personal hygiene and good maintenance practices are stressed. Several emergency first aid treatments are described.

  6. Searching Process with Raita Algorithm and its Application

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Saleh Ahmar, Ansari; Abdullah, Dahlan; Hartama, Dedy; Napitupulu, Darmawan; Putera Utama Siahaan, Andysah; Hasan Siregar, Muhammad Noor; Nasution, Nurliana; Sundari, Siti; Sriadhi, S.

    2018-04-01

    Searching is a common process performed by many computer users. The Raita algorithm is one algorithm that can be used to match and find information according to the patterns entered. The Raita algorithm was applied to a file search application written in the Java programming language; testing showed that the application searches files quickly, returns accurate results, and supports many data types.
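    The Raita algorithm refines Boyer-Moore-Horspool: it keeps the bad-character shift table but, at each alignment, compares the last, first, and middle pattern characters before attempting the full comparison, cheaply rejecting most mismatched windows. A compact sketch in Python (the paper's implementation is in Java; the function name is illustrative):

```python
def raita_search(text, pattern):
    """Raita string matching: Horspool bad-character shifts plus a
    last/first/middle character pre-check before the full comparison.
    Returns all match start positions."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return []
    # Horspool shift table built from all but the last pattern character.
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    last, first, mid = pattern[-1], pattern[0], pattern[m // 2]
    hits, i = [], 0
    while i <= n - m:
        if (text[i + m - 1] == last and text[i] == first
                and text[i + m // 2] == mid
                and text[i:i + m] == pattern):
            hits.append(i)
        i += shift.get(text[i + m - 1], m)
    return hits
```

    The pre-check order (last, then first, then middle) is what distinguishes Raita from plain Horspool; the shift rule is identical.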

  7. Trends for Electron Beam Accelerator Applications in Industry

    NASA Astrophysics Data System (ADS)

    Machi, Sueo

    2011-02-01

    Electron beam (EB) accelerators are major pieces of industrial equipment used for many commercial radiation processing applications. The industrial use of EB accelerators has a history of more than 50 years and is still growing in terms of both its economic scale and new applications. Major applications involve the modification of polymeric materials to create value-added products, such as heat-resistant wires, heat-shrinkable sheets, automobile tires, foamed plastics, battery separators and hydrogel wound dressings. The surface curing of coatings and printing inks is a growing application for low energy electron accelerators, resulting in an environmentally friendly and energy-saving process. Recently, EB accelerators have gained acceptance in lieu of the radioactive isotope cobalt-60 as a source for sterilizing disposable medical products. Environmental protection by the use of EB accelerators is a new and important field of application. A commercial plant for cleaning flue gases from a coal-burning power plant is in operation in Poland, employing high-power EB accelerators. In Korea, a commercial plant uses EB to clean waste water from a dye factory.

  8. Simulator for beam-based LHC collimator alignment

    NASA Astrophysics Data System (ADS)

    Valentino, Gianluca; Aßmann, Ralph; Redaelli, Stefano; Sammut, Nicholas

    2014-02-01

    In the CERN Large Hadron Collider, collimators need to be set up to form a multistage hierarchy to ensure efficient multiturn cleaning of halo particles. Automatic algorithms were introduced during the first run to reduce the beam time required for beam-based setup, improve the alignment accuracy, and reduce the risk of human errors. Simulating the alignment procedure would allow for off-line tests of alignment policies and algorithms. A simulator was developed based on a diffusion beam model to generate the characteristic beam loss signal spike and decay produced when a collimator jaw touches the beam, which is observed in a beam loss monitor (BLM). Empirical models derived from the available measurement data are used to simulate the steady-state beam loss and crosstalk between multiple BLMs. The simulator design is presented, together with simulation results and comparison to measurement data.

  9. Research on cross - Project software defect prediction based on transfer learning

    NASA Astrophysics Data System (ADS)

    Chen, Ya; Ding, Xiaoming

    2018-04-01

    To address two challenges in cross-project software defect prediction, namely the distribution difference between the source and target project datasets and the class imbalance within the datasets, a cross-project software defect prediction method based on transfer learning, named NTrA, is proposed. Firstly, the class imbalance of the source project data is resolved using the Augmented Neighborhood Cleaning Algorithm. Secondly, the data gravity method is used to assign different weights on the basis of the attribute similarity of the source and target project data. Finally, a defect prediction model is constructed using the TrAdaBoost algorithm. Experiments were conducted using NASA and SOFTLAB data from a published PROMISE dataset. The results show that the method achieved good recall and F-measure values and good prediction results.
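    The abstract does not spell out the data gravity formula; a hypothetical similarity-based weighting in that spirit, where source instances closer to the target data receive larger weights, might look like this (the function name and the inverse-mean-distance form are assumptions, not the paper's definition):

```python
import numpy as np

def gravity_weights(source, target):
    """Hypothetical data-gravity weighting: each source-project instance is
    weighted by the inverse of its mean Euclidean distance to the
    target-project instances, so instances similar to the target count
    more during training."""
    dists = np.linalg.norm(source[:, None, :] - target[None, :, :], axis=2)
    return 1.0 / (1.0 + dists.mean(axis=1))
```

    Such weights could then seed the instance weights that a transfer-boosting algorithm like TrAdaBoost re-adjusts each round.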

  10. JPEG 2000 Encoding with Perceptual Distortion Control

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Liu, Zhen; Karam, Lina J.

    2008-01-01

    An alternative approach has been devised for encoding image data in compliance with JPEG 2000, the most recent still-image data-compression standard of the Joint Photographic Experts Group. Heretofore, JPEG 2000 encoding has been implemented by several related schemes classified as rate-based distortion-minimization encoding. In each of these schemes, the end user specifies a desired bit rate and the encoding algorithm strives to attain that rate while minimizing a mean squared error (MSE). While rate-based distortion minimization is appropriate for transmitting data over a limited-bandwidth channel, it is not the best approach for applications in which the perceptual quality of reconstructed images is a major consideration. A better approach for such applications is the present alternative one, denoted perceptual distortion control, in which the encoding algorithm strives to compress data to the lowest bit rate that yields at least a specified level of perceptual image quality. Some additional background information on JPEG 2000 is prerequisite to a meaningful summary of JPEG encoding with perceptual distortion control. The JPEG 2000 encoding process includes two subprocesses known as tier-1 and tier-2 coding. In order to minimize the MSE for the desired bit rate, a rate-distortion- optimization subprocess is introduced between the tier-1 and tier-2 subprocesses. In tier-1 coding, each coding block is independently bit-plane coded from the most-significant-bit (MSB) plane to the least-significant-bit (LSB) plane, using three coding passes (except for the MSB plane, which is coded using only one "clean up" coding pass). For M bit planes, this subprocess involves a total number of (3M - 2) coding passes. An embedded bit stream is then generated for each coding block. Information on the reduction in distortion and the increase in the bit rate associated with each coding pass is collected. 
This information is then used in a rate-control procedure to determine the contribution of each coding block to the output compressed bit stream.
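    The tier-1 bit-plane ordering and the (3M - 2) pass count described above can be sketched with two small helpers (names are illustrative):

```python
def bit_planes(coeffs, n_bits):
    """Decompose coefficient magnitudes into bit-planes, most significant
    plane first, mirroring the order in which tier-1 coding visits them."""
    return [[(abs(c) >> (n_bits - 1 - p)) & 1 for c in coeffs]
            for p in range(n_bits)]

def num_coding_passes(n_bits):
    """Tier-1 makes three passes per plane, except the MSB plane, which is
    coded with a single clean-up pass: 3M - 2 passes for M planes."""
    return 3 * n_bits - 2
```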

  11. LASER APPLICATIONS AND OTHER TOPICS IN QUANTUM ELECTRONICS: Application of the stochastic parallel gradient descent algorithm for numerical simulation and analysis of the coherent summation of radiation from fibre amplifiers

    NASA Astrophysics Data System (ADS)

    Zhou, Pu; Wang, Xiaolin; Li, Xiao; Chen, Zilum; Xu, Xiaojun; Liu, Zejin

    2009-10-01

    Coherent summation of fibre laser beams, which can be scaled to a relatively large number of elements, is simulated by using the stochastic parallel gradient descent (SPGD) algorithm. The applicability of this algorithm for coherent summation is analysed and its optimisation parameters and bandwidth limitations are studied.
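    The SPGD update, perturb all channel phases in parallel, measure the change in a quality metric, and step each phase in proportion to the metric change times its own perturbation, can be sketched for coherent beam combining as follows. The gain, perturbation size, and iteration count are illustrative, not the paper's values.

```python
import numpy as np

def combined_power(phases):
    """Intensity of N coherently summed unit-amplitude beams; the maximum
    N**2 is reached when all phases align."""
    return float(np.abs(np.exp(1j * phases).sum()) ** 2)

def spgd_maximize(phases, metric, gain=0.5, delta=0.1, n_iter=3000, seed=0):
    """Stochastic parallel gradient descent: apply a random +/-delta
    perturbation to every channel simultaneously, measure the two-sided
    metric change, and step each channel by gain * dJ * (its perturbation)."""
    rng = np.random.default_rng(seed)
    u = np.array(phases, dtype=float)
    for _ in range(n_iter):
        du = delta * rng.choice([-1.0, 1.0], size=u.shape)
        dj = metric(u + du) - metric(u - du)
        u += gain * dj * du
    return u
```

    Because only the scalar metric is measured, no per-channel phase sensing is needed, which is why SPGD scales to many elements.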

  12. The Influence of Contamination and Cleaning on the Strength of Modular Head Taper Fixation in Total Hip Arthroplasty.

    PubMed

    Krull, Annika; Morlock, Michael M; Bishop, Nicholas E

    2017-10-01

    Intraoperative interface contamination of modular head-stem taper junctions of hip implants can lead to poor fixation strength, causing fretting and crevice corrosion or even stem taper fracture. Careful cleaning before assembly should help to reduce these problems. The purpose of this study was to determine the effect of cleaning (with and without drying) contaminated taper interfaces on the taper fixation strength. Metal or ceramic heads were impacted onto titanium alloy stem tapers with cleaned or contaminated (fat or saline solution) interfaces. The same procedure was performed after cleaning and drying the contaminated interfaces. Pull-off force was used to determine the influence of contamination and cleaning on the taper strength. Pull-off forces after contamination with fat were significantly lower than those for uncontaminated interfaces for both head materials. Pull-off forces after application of saline solution were not significantly different from those for uncontaminated tapers; however, a large variation in taper strength was observed. Pull-off forces for cleaned and dried tapers were similar to those for uncontaminated tapers for both head materials. Intraoperative contamination of taper interfaces may be difficult to detect but has a major influence on taper fixation strength. Cleaning of the stem taper with saline solution and drying with gauze directly before assembly allows the taper strength of the pristine components to be achieved. Not drying the taper results in a large variation in pull-off forces, emphasizing that drying is essential for sufficient and reproducible fixation strength. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. 7 CFR 634.10 - Applicability.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 6 2013-01-01 2013-01-01 false Applicability. 634.10 Section 634.10 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE LONG TERM CONTRACTING RURAL CLEAN WATER PROGRAM Project Authorization and Funding § 634.10...

  14. 7 CFR 634.10 - Applicability.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 6 2014-01-01 2014-01-01 false Applicability. 634.10 Section 634.10 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE LONG TERM CONTRACTING RURAL CLEAN WATER PROGRAM Project Authorization and Funding § 634.10...

  15. 7 CFR 634.10 - Applicability.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 6 2012-01-01 2012-01-01 false Applicability. 634.10 Section 634.10 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE LONG TERM CONTRACTING RURAL CLEAN WATER PROGRAM Project Authorization and Funding § 634.10...

  16. 7 CFR 634.10 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 6 2010-01-01 2010-01-01 false Applicability. 634.10 Section 634.10 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE LONG TERM CONTRACTING RURAL CLEAN WATER PROGRAM Project Authorization and Funding § 634.10...

  17. 7 CFR 634.13 - Project applications.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., DEPARTMENT OF AGRICULTURE LONG TERM CONTRACTING RURAL CLEAN WATER PROGRAM Project Authorization and Funding... State or areawide 208 water quality management plan. (c) Applications shall contain the following... water quality problem (3) Objectives and planned action, (4) Schedule for carrying out the plan, and (5...

  18. 7 CFR 634.10 - Applicability.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 6 2011-01-01 2011-01-01 false Applicability. 634.10 Section 634.10 Agriculture Regulations of the Department of Agriculture (Continued) NATURAL RESOURCES CONSERVATION SERVICE, DEPARTMENT OF AGRICULTURE LONG TERM CONTRACTING RURAL CLEAN WATER PROGRAM Project Authorization and Funding § 634.10...

  19. Li-Ion Pouch Cell Designs; Performance and Issues for Crewed Vehicle Applications

    NASA Technical Reports Server (NTRS)

    Darcy, Eric

    2011-01-01

    The purpose of this work is to ask whether there are any performance show stoppers for spinning Li-ion pouch cells into spacecraft applications: (1) Are the seals compatible with extended vacuum operations? (2) How uniformly and cleanly are they made? (3) How durable are they?

  20. EVALUATION OF SUPERCRITICAL CARBON DIOXIDE TECHNOLOGY TO REDUCE SOLVENT IN SPRAY COATING APPLICATIONS

    EPA Science Inventory

    This evaluation, part of the Pollution Prevention Clean Technology Demonstration (CTD) Program, addresses the product quality, waste reduction, and economic issues of spray paint application using supercritical carbon dioxide (CO2). Union Carbide has developed this technology and...

  1. Comparison of high‐intensity sound and mechanical vibration for cleaning porous titanium cylinders fabricated using selective laser melting

    PubMed Central

    Seiffert, Gary; Sutcliffe, Chris

    2015-01-01

    Orthopedic components, such as the acetabular cup in total hip joint replacement, can be fabricated using porous metals, such as titanium, and a number of processes, such as selective laser melting. The issue of how to effectively remove loose powder from the pores (residual powder) of such components has not been addressed in the literature. In this work, we investigated the feasibility of two processes, acoustic cleaning using high‐intensity sound inside acoustic horns and mechanical vibration, to remove residual titanium powder from selective laser melting‐fabricated cylinders. With acoustic cleaning, the amount of residual powder removed was not influenced by either the fundamental frequency of the horn used (75 vs. 230 Hz) or, for a given horn, the number of soundings (between 1 and 20). With mechanical vibration, the amount of residual powder removed was not influenced by the application time (10 vs. 20 s). Acoustic cleaning was found to be more reliable and effective in removal of residual powder than cleaning with mechanical vibration. It is concluded that acoustic cleaning using high‐intensity sound has significant potential for use in the final preparation stages of porous metal orthopedic components. © 2015 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 105B: 117–123, 2017. PMID:26426906

  2. Comparative Effectiveness of New Mechanical Irrigant Agitating Devices for Debris Removal from the Canal and Isthmus of Mesial Roots of Mandibular Molars.

    PubMed

    Duque, Jussaro Alves; Duarte, Marco Antonio Hungaro; Canali, Lyz Cristina Furquim; Zancan, Rafaela Fernandes; Vivan, Rodrigo Ricci; Bernardes, Ricardo Affonso; Bramante, Clovis Monteiro

    2017-02-01

    The aim of this study was to compare the effectiveness of Easy Clean (Easy Dental Equipment, Belo Horizonte, MG, Brazil) in continuous and reciprocating motion, passive ultrasonic irrigation (PUI), Endoactivator systems (Dentsply Maillefer, Ballaigues, Switzerland), and conventional irrigation for debris removal from root canals and isthmus. Fifty mesial roots of mandibular molars were embedded in epoxy resin using a metal muffle; afterward, the blocks containing the roots were sectioned at 2, 4, and 6 mm from the apex. After instrumentation, the roots were divided into 5 groups (n = 10) for application of the final irrigation protocol using Easy Clean in continuous rotation, Easy Clean in reciprocating motion, PUI, Endoactivator, and conventional irrigation. Scanning electron microscopic images were taken after instrumentation and after the first, second, and third activation of irrigating solution to evaluate the area of remaining debris with ImageJ software (National Institutes of Health, Bethesda, MD). The protocol of 3 irrigating solution activations for 20 seconds provided better cleaning of the canal and isthmus. On conclusion of all procedures, analysis of the canals showed a statistical difference only at 2 mm; the Easy Clean in continuous rotation was more efficient than conventional irrigation (P < .05). On conclusion of all steps, the largest difference was observed in the isthmus in which the Easy Clean in continuous rotation was more effective than conventional irrigation at the 3 levels analyzed and the Endoactivator at 4 mm (P < .05). The PUI promoted greater cleaning than conventional irrigation at 6 mm (P < .05). There was no statistical difference between Easy Clean in continuous rotation, Easy Clean in reciprocating motion, and PUI (P > .05). Irrigating solution activation methods provided better cleaning of the canal and isthmus, especially the Easy Clean used in continuous rotation. 
The protocol of 3 irrigating solution activations for 20 seconds favored better cleaning. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  3. A Genetic-Based Scheduling Algorithm to Minimize the Makespan of the Grid Applications

    NASA Astrophysics Data System (ADS)

    Entezari-Maleki, Reza; Movaghar, Ali

    Task scheduling algorithms in grid environments strive to maximize the overall throughput of the grid. To maximize the throughput of a grid environment, the makespan of the grid tasks should be minimized. In this paper, a new task scheduling algorithm is proposed to assign tasks to grid resources with the goal of minimizing the total makespan of the tasks. The algorithm uses a genetic approach to find a suitable assignment of tasks to grid resources. The experimental results obtained from applying the proposed algorithm to schedule independent tasks within grid environments demonstrate its applicability in achieving schedules with comparatively lower makespan than other well-known scheduling algorithms such as Min-min, Max-min, RASA, and Sufferage.
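
    The genetic approach described above can be sketched in miniature: each chromosome maps tasks to resources, fitness is the makespan (the load of the busiest resource), and elitist selection, single-point crossover, and mutation evolve the population. This is a generic illustration of the idea, not the authors' algorithm; all parameter values below are arbitrary.

    ```python
    import random

    def makespan(assignment, task_times, n_resources):
        """Total completion time: the load of the busiest resource."""
        loads = [0.0] * n_resources
        for task, resource in zip(task_times, assignment):
            loads[resource] += task
        return max(loads)

    def genetic_schedule(task_times, n_resources, pop_size=40, generations=200, seed=1):
        rng = random.Random(seed)
        n = len(task_times)
        # Each chromosome maps task i -> resource assignment[i].
        pop = [[rng.randrange(n_resources) for _ in range(n)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda a: makespan(a, task_times, n_resources))
            survivors = pop[: pop_size // 2]          # elitist selection
            children = []
            while len(survivors) + len(children) < pop_size:
                p1, p2 = rng.sample(survivors, 2)
                cut = rng.randrange(1, n)             # single-point crossover
                child = p1[:cut] + p2[cut:]
                if rng.random() < 0.2:                # mutation: reassign one task
                    child[rng.randrange(n)] = rng.randrange(n_resources)
                children.append(child)
            pop = survivors + children
        best = min(pop, key=lambda a: makespan(a, task_times, n_resources))
        return best, makespan(best, task_times, n_resources)
    ```

    For 8 independent tasks on 2 resources the search space is tiny, so the evolved makespan should approach the load-balance lower bound (half the total work).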

  4. A computerized compensator design algorithm with launch vehicle applications

    NASA Technical Reports Server (NTRS)

    Mitchell, J. R.; Mcdaniel, W. L., Jr.

    1976-01-01

    This short paper presents a computerized algorithm for the design of compensators for large launch vehicles. The algorithm is applicable to the design of compensators for linear, time-invariant control systems whose plant possesses a single control input and multiple outputs. The achievement of frequency response specifications is cast into a strict-constraint mathematical programming format. An improved solution algorithm for solving this type of problem is given, along with the mathematical requirements for application to systems of the above type. A computer program, the compensator improvement program (CIP), has been developed and applied to a pragmatic space-industry-related example.

  5. Image processing meta-algorithm development via genetic manipulation of existing algorithm graphs

    NASA Astrophysics Data System (ADS)

    Schalkoff, Robert J.; Shaaban, Khaled M.

    1999-07-01

    Automatic algorithm generation for image processing applications is not a new idea; however, previous work is either restricted to morphological operators or impractical. In this paper, we show recent research results in the development and use of meta-algorithms, i.e., algorithms which lead to new algorithms. Although the concept is generally applicable, the application domain in this work is restricted to image processing. The meta-algorithm concept described in this paper is based upon our work on dynamic algorithms. The paper first presents the concept of dynamic algorithms which, on the basis of training and archived algorithmic experience embedded in an algorithm graph (AG), dynamically adjust the sequence of operations applied to the input image data. Each node in the tree-based representation of a dynamic algorithm with out-degree greater than 2 is a decision node. At these nodes, the algorithm examines the input data and determines which path will most likely achieve the desired results. This is currently done using nearest-neighbor classification. The details of this implementation are shown. The constrained perturbation of existing algorithm graphs, coupled with a suitable search strategy, is one mechanism to achieve meta-algorithms and offers rich potential for the discovery of new algorithms. In our work, a meta-algorithm autonomously generates new dynamic algorithm graphs via genetic recombination of existing algorithm graphs. The AG representation is well suited to this genetic-like perturbation, using a commonly employed technique in artificial neural network synthesis, namely the blueprint representation of graphs. A number of examples are presented. One of the principal limitations of our current approach is the need for significant human input in the learning phase. Efforts to overcome this limitation are discussed. Future research directions are indicated.

  6. Conjugate Gradient Algorithms For Manipulator Simulation

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Scheid, Robert E.

    1991-01-01

    Report discusses applicability of conjugate-gradient algorithms to computation of forward dynamics of robotic manipulators. Rapid computation of forward dynamics essential to teleoperation and other advanced robotic applications. Part of continuing effort to find algorithms meeting requirements for increased computational efficiency and speed. Method used for iterative solution of systems of linear equations.
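
    For reference, the conjugate-gradient iteration discussed in the report can be sketched as follows for a symmetric positive-definite system Ax = b. This is the textbook method, not the manipulator-specific formulation in the report.

    ```python
    def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
        """Solve A x = b for symmetric positive-definite A (lists of lists)."""
        n = len(b)
        x = [0.0] * n
        r = b[:]                       # residual r = b - A x (x = 0 initially)
        p = r[:]                       # search direction
        rs_old = sum(ri * ri for ri in r)
        for _ in range(max_iter):
            Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
            alpha = rs_old / sum(p[i] * Ap[i] for i in range(n))
            x = [x[i] + alpha * p[i] for i in range(n)]
            r = [r[i] - alpha * Ap[i] for i in range(n)]
            rs_new = sum(ri * ri for ri in r)
            if rs_new < tol:
                break
            # new direction is conjugate to all previous ones
            p = [r[i] + (rs_new / rs_old) * p[i] for i in range(n)]
            rs_old = rs_new
        return x
    ```

    In exact arithmetic the method converges in at most n iterations, which is part of its appeal for fast forward-dynamics computation.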

  7. GEO-CAPE Coastal Ocean Ecosystem Dynamics White Paper ...

    EPA Pesticide Factsheets

    The Clean Water Act protects all navigable waters in the United States (CWA, 1988). The objective of the CWA is to "restore and maintain the chemical, physical, and biological integrity of the Nation's waters." This Federal mandate authorizes states, tribes, and U.S. territories, with guidance and oversight from the U.S. Environmental Protection Agency (EPA), to develop and implement water quality standards to protect the human and aquatic life uses of the Nation’s waterways. Water quality standards include designated uses, defined as the services that a water body supports such as drinking water, aquatic life, harvestable species, and recreation. These standards under the CWA Section 304(a) are applicable within state waters, defined as less than 3 nautical miles from shore. Therefore, a majority of research by the EPA addresses near-shore coastal waters within 3 nautical miles, estuaries and lakes where applicable water quality regulation could be implemented. Policy makers and environmental managers in EPA’s program and regional offices need tools enabling them to assess the sustainability of watershed ecosystems, and the services they provide, under current and future land use practices. The typical 1-km resolution and current Case 1 algorithms of SeaWiFS, MODIS, and VIIRS provide limited assessments of near-shore coastal waters, estuaries and lakes. It has proven difficult to adequately resolve and derive products in smaller estuaries or waters in close proximity to shore.

  8. Building automatic customer complaints filtering application based on Twitter in Bahasa Indonesia

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Siregar, R. P.; Rahmat, R. F.; Amalia, A.

    2018-03-01

    Twitter has become a medium for communication between a company and its customers. The average number of monthly active Twitter users is 330 million. Many companies realize the potential of Twitter to establish good relationships with their customers; therefore, they usually maintain an official Twitter account to act as a customer care division. In Indonesia, one of the companies that utilizes the potential of Twitter to reach its customers is PT Telkom. PT Telkom has an official customer service account (called @TelkomCare) to receive customers’ problems. However, because this account is open to the public, Twitter users may post all kinds of messages (not only complaints) to the Telkom Care account. This leads to a problem: the Telkom Care account contains not only customer complaints but also compliments and ordinary messages. Furthermore, the complaints should be distributed to the relevant division, such as “Indihome”, “Telkomsel”, “UseeTV”, or “Telepon”, based on the content of the message. This research built an application that automatically filters Twitter messages into several pre-defined categories (based on existing divisions) using the Naïve Bayes algorithm. The research was done by collecting Twitter messages, data cleaning, data pre-processing, training and testing the data, and evaluating the classification results. The approach yields 97% accuracy in classifying Twitter messages into the categories mentioned earlier.
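
    The classification step can be illustrated with a minimal multinomial Naïve Bayes classifier with Laplace smoothing. This is a generic sketch of the technique named in the abstract; the category labels and training messages below are hypothetical, not taken from the paper's dataset.

    ```python
    import math
    from collections import Counter, defaultdict

    def train_nb(samples):
        """samples: list of (text, label). Returns class counts, per-class
        word frequencies, the vocabulary, and the number of samples."""
        class_counts = Counter(label for _, label in samples)
        word_counts = defaultdict(Counter)          # label -> word frequencies
        vocab = set()
        for text, label in samples:
            for word in text.lower().split():
                word_counts[label][word] += 1
                vocab.add(word)
        return class_counts, word_counts, vocab, len(samples)

    def classify_nb(model, text):
        """Pick the label maximizing log P(label) + sum of log P(word | label)."""
        class_counts, word_counts, vocab, n = model
        best_label, best_score = None, float("-inf")
        for label in class_counts:
            score = math.log(class_counts[label] / n)         # log prior
            total = sum(word_counts[label].values())
            for word in text.lower().split():
                # Laplace (add-one) smoothing over the vocabulary
                score += math.log((word_counts[label][word] + 1) / (total + len(vocab)))
            if score > best_score:
                best_label, best_score = label, score
        return best_label
    ```

    In practice the paper's pipeline would precede this with the data cleaning and pre-processing steps it describes (and would use its four division categories rather than two toy labels).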

  9. Infrared image enhancement using H(infinity) bounds for surveillance applications.

    PubMed

    Qidwai, Uvais

    2008-08-01

    In this paper, two algorithms have been presented to enhance the infrared (IR) images. Using the autoregressive moving average model structure and H(infinity) optimal bounds, the image pixels are mapped from the IR pixel space into normal optical image space, thus enhancing the IR image for improved visual quality. Although H(infinity)-based system identification algorithms are very common now, they are not quite suitable for real-time applications owing to their complexity. However, many variants of such algorithms are possible that can overcome this constraint. Two such algorithms have been developed and implemented in this paper. Theoretical and algorithmic results show remarkable enhancement in the acquired images. This will help in enhancing the visual quality of IR images for surveillance applications.

  10. Bio-Inspired Genetic Algorithms with Formalized Crossover Operators for Robotic Applications.

    PubMed

    Zhang, Jie; Kang, Man; Li, Xiaojuan; Liu, Geng-Yang

    2017-01-01

    Genetic algorithms are widely adopted to solve optimization problems in robotic applications. In such safety-critical systems, it is vitally important to formally prove correctness when genetic algorithms are applied. This paper focuses on formal modeling of crossover operations, which are among the most important operations in genetic algorithms. Specifically, we formalize, for the first time, crossover operations in higher-order logic based on HOL4, which is easy to deploy owing to its user-friendly programming environment. With correctness-guaranteed formalized crossover operations, we can safely apply them in robotic applications. We implement our technique to solve a path planning problem using a genetic algorithm with our formalized crossover operations, and the results show the effectiveness of our technique.
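
    To make the object of the formalization concrete, here is a plain single-point crossover together with the kind of correctness property one would state in HOL4 (expressed as a runtime check rather than a theorem). This is an illustrative sketch, not the paper's HOL4 development.

    ```python
    import random

    def single_point_crossover(parent_a, parent_b, rng=random):
        """Exchange the tails of two equal-length chromosomes at a random cut."""
        assert len(parent_a) == len(parent_b) and len(parent_a) > 1
        cut = rng.randrange(1, len(parent_a))
        child_a = parent_a[:cut] + parent_b[cut:]
        child_b = parent_b[:cut] + parent_a[cut:]
        return child_a, child_b

    def crossover_is_sound(parent_a, parent_b, child_a, child_b):
        """The property one would prove formally: no gene is invented or lost --
        at every position, the children jointly carry exactly the genes the
        parents had there."""
        return all(
            {child_a[i], child_b[i]} == {parent_a[i], parent_b[i]}
            for i in range(len(parent_a))
        )
    ```

    Proving `crossover_is_sound` for all inputs (rather than testing it on samples) is precisely what a higher-order-logic formalization buys in a safety-critical setting.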

  11. A roadmap of clustering algorithms: finding a match for a biomedical application.

    PubMed

    Andreopoulos, Bill; An, Aijun; Wang, Xiaogang; Schroeder, Michael

    2009-05-01

    Clustering is ubiquitously applied in bioinformatics with hierarchical clustering and k-means partitioning being the most popular methods. Numerous improvements of these two clustering methods have been introduced, as well as completely different approaches such as grid-based, density-based and model-based clustering. For improved bioinformatics analysis of data, it is important to match clusterings to the requirements of a biomedical application. In this article, we present a set of desirable clustering features that are used as evaluation criteria for clustering algorithms. We review 40 different clustering algorithms of all approaches and datatypes. We compare algorithms on the basis of desirable clustering features, and outline algorithms' benefits and drawbacks as a basis for matching them to biomedical applications.

  12. Chemical exposure among professional ski waxers--characterization of individual work operations.

    PubMed

    Freberg, Baard Ingegerdsson; Olsen, Raymond; Thorud, Syvert; Ellingsen, Dag G; Daae, Hanne Line; Hersson, Merete; Molander, Paal

    2013-04-01

    Preparation of skis prior to skiing competitions involves several individual work operations and the use of a wide variety of chemically based ski waxing products to improve the performance of the skis, including products used after skiing for wax removal and ski sole cleaning. Modern ski waxes consist mainly of petroleum-derived straight-chain aliphatic hydrocarbons, perfluoro-n-alkanes or polyfluorinated n-alkanes. The wax cleaning products contain solvents such as neat aliphatic hydrocarbons (aliphates) or a mixture with limonene. Different ski waxing work operations can result in contaminated workroom atmospheres. The aim of this study was to assess the chemical exposures related to the individual ski waxing work operations by investigating the specific work operations in controlled model experiments. Four main work operations with potential exposures were identified: (i) application of glider waxes, (ii) scraping and brushing of applied glider waxes, (iii) application of base/grip waxes, and (iv) ski sole cleaning. Aerosol particle masses were sampled using conical samplers equipped with 37-mm PVC, 5-µm pore size filters and cyclones equipped with 37-mm PVC, 0.8-µm pore size filters for the inhalable and the respirable aerosol mass fractions, respectively. For measurements of particle number concentrations, a Scanning Mobility Particle Sizer was used. Mean aerosol particle mass concentrations of 18.6 mg m(-3) and 32.2 mg m(-3) were measured during application of glider wax powders in the respirable and in the inhalable aerosol mass fractions, respectively. Particle number concentration of ~900 000 particles cm(-3) was measured during application of glider wax powder products. Ski sole cleaning with products containing aliphates displayed solvent air concentrations up to 62.5 p.p.m. This study shows that the potential exposure to generated particles during ski waxing and ski preparation is considerable, especially during work using glide wax powders.

  13. Nonlinear Least-Squares Based Method for Identifying and Quantifying Single and Mixed Contaminants in Air with an Electronic Nose

    PubMed Central

    Zhou, Hanying; Homer, Margie L.; Shevade, Abhijit V.; Ryan, Margaret A.

    2006-01-01

    The Jet Propulsion Laboratory has recently developed and built an electronic nose (ENose) using a polymer-carbon composite sensing array. This ENose is designed to be used for air quality monitoring in an enclosed space, and is designed to detect, identify and quantify common contaminants at concentrations in the parts-per-million range. Its capabilities were demonstrated in an experiment aboard the National Aeronautics and Space Administration's Space Shuttle Flight STS-95. This paper describes a modified nonlinear least-squares based algorithm developed to analyze data taken by the ENose, and its performance for the identification and quantification of single gases and binary mixtures of twelve target analytes in clean air. Results from laboratory-controlled events demonstrate the effectiveness of the algorithm to identify and quantify a gas event if concentration exceeds the ENose detection threshold. Results from the flight test demonstrate that the algorithm correctly identifies and quantifies all registered events (planned or unplanned, as singles or mixtures) with no false positives and no inconsistencies with the logged events and the independent analysis of air samples.
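
    The abstract does not reproduce the modified algorithm itself, but the general idea of nonlinear least-squares fitting, iteratively refining model parameters to minimize the sum of squared residuals, can be sketched with a damped Gauss-Newton loop. The saturating "sensor response" model below is a hypothetical stand-in for the ENose response functions, and the two-parameter normal-equation solve is a simplification.

    ```python
    import math

    def sse(model, p, ts, ys):
        """Sum of squared residuals of the model against the observations."""
        return sum((y - model(p, t)) ** 2 for t, y in zip(ts, ys))

    def gauss_newton(model, params, ts, ys, iters=50, eps=1e-6):
        """Damped Gauss-Newton for a 2-parameter model with a numerical Jacobian."""
        p = list(params)
        for _ in range(iters):
            r = [y - model(p, t) for t, y in zip(ts, ys)]
            J = []                                   # forward-difference Jacobian
            for t in ts:
                row = []
                for k in range(len(p)):
                    q = list(p)
                    q[k] += eps
                    row.append((model(q, t) - model(p, t)) / eps)
                J.append(row)
            # normal equations (J^T J) dp = J^T r, solved for the 2x2 case
            a = sum(row[0] ** 2 for row in J)
            b = sum(row[0] * row[1] for row in J)
            d = sum(row[1] ** 2 for row in J)
            g0 = sum(row[0] * ri for row, ri in zip(J, r))
            g1 = sum(row[1] * ri for row, ri in zip(J, r))
            det = a * d - b * b
            dp = [(d * g0 - b * g1) / det, (a * g1 - b * g0) / det]
            # backtracking: halve the step until the squared error decreases
            step = 1.0
            while step > 1e-8 and sse(model, [pi + step * di for pi, di in zip(p, dp)],
                                      ts, ys) > sse(model, p, ts, ys):
                step *= 0.5
            p = [pi + step * di for pi, di in zip(p, dp)]
        return p

    # Hypothetical sensor model: response saturating toward amplitude `amp`
    # at rate `rate` as the analyte concentration builds up.
    def response(p, t):
        amp, rate = p
        return amp * (1.0 - math.exp(-rate * t))
    ```

    On noiseless synthetic data the loop recovers the generating parameters; real ENose analysis additionally has to handle mixtures and detection thresholds, as the abstract describes.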

  14. Sifting Through SDO's AIA Cosmic Ray Hits to Find Treasure

    NASA Astrophysics Data System (ADS)

    Kirk, M. S.; Thompson, B. J.; Viall, N. M.; Young, P. R.

    2017-12-01

    The Solar Dynamics Observatory's Atmospheric Imaging Assembly (SDO AIA) has revolutionized solar imaging with its high temporal and spatial resolution, unprecedented spatial and temporal coverage, and seven EUV channels. Automated algorithms routinely clean these images to remove cosmic ray intensity spikes as a part of its preprocessing algorithm. We take a novel approach to survey the entire set of AIA "spike" data to identify and group compact brightenings across the entire SDO mission. The AIA team applies a de-spiking algorithm to remove magnetospheric particle impacts on the CCD cameras, but it has been found that compact, intense solar brightenings are often removed as well. We use the spike database to mine the data and form statistics on compact solar brightenings without having to process large volumes of full-disk AIA data. There are approximately 3 trillion "spiked pixels" removed from images over the mission to date. We estimate that 0.001% of those are of solar origin and removed by mistake, giving us a pre-segmented dataset of 30 million events. We explore the implications of these statistics and the physical qualities of the "spikes" of solar origin.

  15. Cleaning verification by air/water impingement

    NASA Technical Reports Server (NTRS)

    Jones, Lisa L.; Littlefield, Maria D.; Melton, Gregory S.; Caimi, Raoul E. B.; Thaxton, Eric A.

    1995-01-01

    This paper will discuss how the Kennedy Space Center intends to perform precision cleaning verification by Air/Water Impingement in lieu of chlorofluorocarbon-113 gravimetric nonvolatile residue analysis (NVR). Test results will be given that demonstrate the effectiveness of the Air/Water system. A brief discussion of the Total Carbon method via the use of a high temperature combustion analyzer will also be given. The necessary equipment for impingement will be shown along with other possible applications of this technology.

  16. Substitution of Wax and Grease Cleaners With Biodegradable Solvents: Phase 1. Part 2

    DTIC Science & Technology

    1989-09-01

    [Abstract not available; the record contains only OCR fragments of a product table listing wax and grease cleaners (e.g., Magnuson Products Permag #404, Meqqem-Clean 8512/8516) with their ingredients, applications, and use concentrations.]

  17. Comparison of algorithms for blood stain detection applied to forensic hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Messinger, David W.; Mathew, Jobin J.; Dube, Roger R.

    2016-05-01

    Blood stains are among the most important types of evidence for forensic investigation. They contain valuable DNA information, and the pattern of the stains can suggest specifics about the nature of the violence that transpired at the scene. Early detection of blood stains is particularly important since blood reacts physically and chemically with air and materials over time. Accurate identification of blood remnants, including regions that might have been intentionally cleaned, is an important aspect of forensic investigation. Hyperspectral imaging is a potential method for detecting blood stains because it is non-contact and provides substantial spectral information that can be used to identify regions in a scene with trace amounts of blood. Such scenes can be highly complex, given the range of material types and conditions on which blood stains may occur at a crime scene. Some stains are hard to detect by the unaided eye, especially if a conscious effort to clean the scene has occurred (we refer to these as "latent" blood stains). In this paper we present the initial results of a study of hyperspectral imaging algorithms for blood detection in complex scenes. We describe a hyperspectral imaging system which generates images covering the 400-700 nm visible range with a spectral resolution of 10 nm. Three image sets of 31 wavelength bands were generated using this camera for a simulated indoor crime scene in which blood stains were placed on a T-shirt and walls. To detect blood stains in the scene, Principal Component Analysis (PCA), Subspace Reed-Xiaoli Detection (SRXD), and Topological Anomaly Detection (TAD) algorithms were used. Comparison of the three hyperspectral image analysis techniques shows that TAD is most suitable for detecting blood stains and discovering latent blood stains.
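
    Of the three detectors, PCA-based anomaly scoring is the simplest to sketch: model the background with the leading principal components of the pixel spectra and score each pixel by the spectral energy those components fail to explain. The toy version below (pure Python, power iteration with deflation) illustrates the principle only; the paper's SRXD and TAD detectors are more sophisticated, and the endmember/anomaly spectra in the test are synthetic.

    ```python
    def top_components(spectra, k=2, iters=200):
        """Leading principal axes of mean-centred spectra via power iteration
        with deflation (fine for a handful of bands and components)."""
        n, d = len(spectra), len(spectra[0])
        mean = [sum(col) / n for col in zip(*spectra)]
        X = [[v - m for v, m in zip(row, mean)] for row in spectra]
        # band-by-band covariance (unnormalised)
        C = [[sum(row[i] * row[j] for row in X) for j in range(d)] for i in range(d)]
        axes = []
        for _ in range(k):
            v = [1.0] * d
            for _ in range(iters):
                w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
                norm = sum(x * x for x in w) ** 0.5
                v = [x / norm for x in w]
            lam = sum(v[i] * sum(C[i][j] * v[j] for j in range(d)) for i in range(d))
            # deflate: remove the found component from the covariance
            for i in range(d):
                for j in range(d):
                    C[i][j] -= lam * v[i] * v[j]
            axes.append(v)
        return mean, axes

    def anomaly_scores(spectra, k=2):
        """Residual energy of each pixel outside the top-k principal subspace;
        pixels the background model cannot explain (e.g. a trace stain on
        fabric) score high."""
        mean, axes = top_components(spectra, k)
        scores = []
        for row in spectra:
            x = [v - m for v, m in zip(row, mean)]
            for a in axes:                       # project out each axis
                c = sum(xi * ai for xi, ai in zip(x, a))
                x = [xi - c * ai for xi, ai in zip(x, a)]
            scores.append(sum(xi * xi for xi in x) ** 0.5)
        return scores
    ```

    With a background made of mixtures of a few material spectra, a single pixel with a foreign spectral shape stands out by orders of magnitude in this score.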

  18. 3D measurement by digital photogrammetry

    NASA Astrophysics Data System (ADS)

    Schneider, Carl T.

    1993-12-01

    Photogrammetry is well known from geodetic surveys as aerial photogrammetry and from close-range applications as architectural photogrammetry. Photogrammetric methods and algorithms, combined with digital cameras and digital image processing, are now being introduced into industrial applications such as automation and quality control. This paper describes the photogrammetric and digital image processing algorithms and the calibration methods. These algorithms and methods are demonstrated with application examples: a digital photogrammetric workstation as a mobile multi-purpose 3D measuring tool, and a tube measuring system as an example of a single-purpose tool.

  19. Efficacy of a barrier gel for reducing the development of plaque, calculus, and gingivitis in cats.

    PubMed

    Bellows, Jan; Carithers, Douglas S; Gross, Sheila J

    2012-01-01

    This study was performed to assess the field efficacy of a professional and home-care barrier gel against the development of plaque, calculus, gingival bleeding, and gingivitis in client-owned cats over a 56-day period compared with negative controls. In a randomized, negative-controlled, outcome evaluator-blinded, client-owned animal clinical field study, 31 cats were evaluated to assess if the barrier gel dental product was effective in cats. Following an enrollment-qualification assessment and enrollment of each cat, all cats received a professional dental cleaning, including polishing and irrigation. Following cleaning, a post-cleaning assessment was performed by the evaluator. Then, using a pre-developed randomization schedule, cats were assigned to the treated or control group. The professional version of the barrier gel was applied to the treated group on day 0. The negative-control group patients did not receive any applications of the barrier gel following dental cleaning. Treated-group cats were brought back to the clinic for subsequent applications of the home-care version of the barrier gel, applied by a non-blinded trained assistant. The home-care version product applications began on day 14 and then were applied weekly (days, 21, 28, 35, 42, 49 and 56) through day 56. All cats enrolled in the study underwent full oral examinations and assessments by the blinded evaluator on or about their respective days 28 and 56. At these evaluations, the evaluator performed standardized assessments for plaque, calculus, gingivitis, and gingival bleeding. Numeric scores were assigned for each assessment using predetermined target teeth to ensure consistency. Using these assessment scores, statistical analyses were performed to determine the efficacies against plaque and calculus deposition; additionally, measurements of gingivitis and gingival bleeding were assessed. 
Change in plaque score from baseline, for all teeth assessed (all 4 canine teeth, and all 4 [corrected] premolar teeth), was significantly (P < 0.05) lower for treated cats than for control cats for both left side average and right side average on day 56. No statistical differences were seen for calculus, gingivitis, or gingival bleeding in this study. In cats with a history of developing plaque, application of the barrier gel dental product following dental cleaning reduced plaque deposition (P < 0.05) compared with control cats.

  20. The development of a scalable parallel 3-D CFD algorithm for turbomachinery. M.S. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Luke, Edward Allen

    1993-01-01

    Two algorithms capable of computing a transonic 3-D inviscid flow field about rotating machines are considered for parallel implementation. During the study of these algorithms, a significant new method of measuring the performance of parallel algorithms is developed. The theory that supports this new method creates an empirical definition of scalable parallel algorithms that is used to produce quantifiable evidence that a scalable parallel application was developed. The implementation of the parallel application and an automated domain decomposition tool are also discussed.

  1. The small-scale turbulent dynamo in smoothed particle magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Tricco, T. S.; Price, D. J.; Federrath, C.

    2016-05-01

    Supersonic turbulence is believed to be at the heart of star formation. We have performed smoothed particle magnetohydrodynamics (SPMHD) simulations of the small-scale dynamo amplification of magnetic fields in supersonic turbulence. The calculations use isothermal gas driven at an rms velocity of Mach 10, so that conditions are representative of star-forming molecular clouds in the Milky Way. The growth of magnetic energy is followed over 10 orders of magnitude until it reaches saturation, at a few percent of the kinetic energy. The results of our dynamo calculations are compared with results from grid-based methods, finding excellent agreement in their statistics and qualitative behaviour. The simulations utilise our latest algorithmic developments, in particular a new divergence cleaning approach to maintain the solenoidal constraint on the magnetic field and a method to reduce the numerical dissipation of the magnetic shock capturing scheme. We demonstrate that our divergence cleaning method may be used to achieve ∇ · B = 0 to machine precision, albeit at significant computational expense.

  2. 75 FR 7475 - Agency Information Collection Activity; Proposed Collection; Comment Request; Information...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-19

    ...; Information Collection Request for Application for Sustainable Water Leadership Program AGENCY: Environmental...: Application for Sustainable Water Leadership Program (formerly named the Annual National Clean Water Act... infrastructure initiatives and is now called the Sustainable Water Leadership Program. The Sustainable Water...

  3. 40 CFR 35.920 - Grant application.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 1 2014-07-01 2014-07-01 false Grant application. 35.920 Section 35.920 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GRANTS AND OTHER FEDERAL ASSISTANCE STATE AND LOCAL ASSISTANCE Grants for Construction of Treatment Works-Clean Water Act § 35.920 Grant...

  4. 40 CFR 35.920 - Grant application.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 1 2013-07-01 2013-07-01 false Grant application. 35.920 Section 35.920 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GRANTS AND OTHER FEDERAL ASSISTANCE STATE AND LOCAL ASSISTANCE Grants for Construction of Treatment Works-Clean Water Act § 35.920 Grant...

  5. 40 CFR 35.920 - Grant application.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 1 2011-07-01 2011-07-01 false Grant application. 35.920 Section 35.920 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GRANTS AND OTHER FEDERAL ASSISTANCE STATE AND LOCAL ASSISTANCE Grants for Construction of Treatment Works-Clean Water Act § 35.920 Grant...

  6. 40 CFR 35.920 - Grant application.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 1 2012-07-01 2012-07-01 false Grant application. 35.920 Section 35.920 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GRANTS AND OTHER FEDERAL ASSISTANCE STATE AND LOCAL ASSISTANCE Grants for Construction of Treatment Works-Clean Water Act § 35.920 Grant...

  7. 40 CFR 35.920 - Grant application.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Grant application. 35.920 Section 35.920 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY GRANTS AND OTHER FEDERAL ASSISTANCE STATE AND LOCAL ASSISTANCE Grants for Construction of Treatment Works-Clean Water Act § 35.920 Grant...

  8. 40 CFR 136.1 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) GUIDELINES... the waste constituent specified is required to be measured for: (1) An application submitted to the Administrator, or to a State having an approved NPDES program for a permit under section 402 of the Clean Water...

  9. 40 CFR 122.22 - Signatories to permit applications and reports (applicable to State programs, see § 123.25).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... initiating and directing other comprehensive measures to assure long term environmental compliance with....), Clean Air Act (42 U.S.C. 7401 et seq.), Resource Conservation and Recovery Act (42 U.S.C. 6901 et seq...

  10. 40 CFR 6.401 - Applicability.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... ENVIRONMENTAL POLICY ACT AND ASSESSING THE ENVIRONMENTAL EFFECTS ABROAD OF EPA ACTIONS Assessing the Environmental Effects Abroad of EPA Actions § 6.401 Applicability. (a) Administrative actions requiring.... 6925), NPDES permits pursuant to section 402 of the Clean Water Act (33 U.S.C. 1342), and prevention of...

  11. 40 CFR 6.401 - Applicability.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ENVIRONMENTAL POLICY ACT AND ASSESSING THE ENVIRONMENTAL EFFECTS ABROAD OF EPA ACTIONS Assessing the Environmental Effects Abroad of EPA Actions § 6.401 Applicability. (a) Administrative actions requiring.... 6925), NPDES permits pursuant to section 402 of the Clean Water Act (33 U.S.C. 1342), and prevention of...

  12. I/O efficient algorithms and applications in geographic information systems

    NASA Astrophysics Data System (ADS)

    Danner, Andrew

    Modern remote sensing methods such as laser altimetry (lidar) and Interferometric Synthetic Aperture Radar (IfSAR) produce georeferenced elevation data at unprecedented rates. Many Geographic Information System (GIS) algorithms designed for terrain modelling applications cannot process these massive data sets. The primary problem is that these data sets are too large to fit in the main internal memory of modern computers and must therefore reside on larger, but considerably slower disks. In these applications, the transfer of data between disk and main memory, or I/O, becomes the primary bottleneck. Working in a theoretical model that more accurately represents this two-level memory hierarchy, we can develop algorithms that are I/O-efficient and reduce the amount of disk I/O needed to solve a problem. In this thesis we aim to modernize GIS algorithms and develop a number of I/O-efficient algorithms for processing geographic data derived from massive elevation data sets. For each application, we convert a geographic question to an algorithmic question, develop an I/O-efficient algorithm that is theoretically efficient, implement our approach and verify its performance using real-world data. The applications we consider include constructing a gridded digital elevation model (DEM) from an irregularly spaced point cloud, removing topological noise from a DEM, modeling surface water flow over a terrain, extracting river networks and watershed hierarchies from the terrain, and locating polygons containing query points in a planar subdivision. We initially developed solutions to each of these applications individually. However, we also show how to combine individual solutions to form a scalable geo-processing pipeline that seamlessly solves a sequence of sub-problems with little or no manual intervention. We present experimental results that demonstrate orders of magnitude improvement over previously known algorithms.
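
    A canonical example of the I/O-efficient pattern such theses build on is external merge sort: process the data in memory-sized runs, spill each sorted run to disk, then merge the runs with purely sequential reads. A minimal sketch (run sizes and file handling are simplified for illustration):

    ```python
    import heapq
    import tempfile

    def external_sort(values, chunk_size):
        """Sort a stream of integers that (notionally) exceeds main memory:
        sort chunk_size-item runs in memory, spill each to disk, then k-way
        merge the runs reading them back sequentially -- the sequential access
        pattern is what makes the algorithm I/O-efficient."""
        runs = []
        values = iter(values)
        while True:
            chunk = []
            for _ in range(chunk_size):
                try:
                    chunk.append(next(values))
                except StopIteration:
                    break
            if not chunk:
                break
            chunk.sort()                                 # in-memory run formation
            run = tempfile.TemporaryFile(mode="w+")
            run.writelines(f"{v}\n" for v in chunk)
            run.seek(0)
            runs.append(run)
        # k-way merge of the sorted runs with a min-heap
        streams = [(int(line) for line in run) for run in runs]
        return list(heapq.merge(*streams))
    ```

    In the external-memory model this performs a sorting pass and a merge pass over the data, rather than the random accesses an in-place sort would incur on disk-resident data.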

  13. Pose estimation for augmented reality applications using genetic algorithm.

    PubMed

    Yu, Ying Kin; Wong, Kin Hong; Chang, Michael Ming Yuen

    2005-12-01

    This paper describes a genetic algorithm that tackles the pose-estimation problem in computer vision. Our genetic algorithm can find the rotation and translation of an object accurately when the three-dimensional structure of the object is given. In our implementation, each chromosome encodes both the pose and the indexes to the selected point features of the object. Instead of only searching for the pose as in existing work, our algorithm at the same time searches for a set containing the most reliable feature points. This mismatch filtering strategy makes the algorithm more robust in the presence of point mismatches and outliers in the images. Our algorithm has been tested with both synthetic and real data with good results. The accuracy of the recovered pose is compared to that of existing algorithms. Our approach outperformed Lowe's method and two other genetic algorithms in the presence of point mismatches and outliers. In addition, it has been used to estimate the pose of a real object. It is shown that the proposed method is applicable to augmented reality applications.

  14. 21 CFR 884.1640 - Culdoscope and accessories.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... instruments include: lens cleaning brush, biopsy brush, clip applier (without clips), applicator, cannula... (noninflatable), snare, stylet, forceps, dissector, mechanical (noninflatable) scissors, and suction/irrigation...

  15. 21 CFR 884.1640 - Culdoscope and accessories.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... instruments include: lens cleaning brush, biopsy brush, clip applier (without clips), applicator, cannula... (noninflatable), snare, stylet, forceps, dissector, mechanical (noninflatable) scissors, and suction/irrigation...

  16. 21 CFR 884.1640 - Culdoscope and accessories.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... instruments include: lens cleaning brush, biopsy brush, clip applier (without clips), applicator, cannula... (noninflatable), snare, stylet, forceps, dissector, mechanical (noninflatable) scissors, and suction/irrigation...

  17. 40 CFR 446.10 - Applicability; description of the oil-base solvent wash paint subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... CATEGORY Oil-Base Solvent Wash Paint Subcategory § 446.10 Applicability; description of the oil-base... the production of oil-base paint where the tank cleaning is performed using solvents. When a plant is... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Applicability; description of the oil...

  18. 40 CFR 446.10 - Applicability; description of the oil-base solvent wash paint subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CATEGORY Oil-Base Solvent Wash Paint Subcategory § 446.10 Applicability; description of the oil-base... the production of oil-base paint where the tank cleaning is performed using solvents. When a plant is... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Applicability; description of the oil...

  19. 40 CFR 446.10 - Applicability; description of the oil-base solvent wash paint subcategory.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CATEGORY Oil-Base Solvent Wash Paint Subcategory § 446.10 Applicability; description of the oil-base... the production of oil-base paint where the tank cleaning is performed using solvents. When a plant is... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Applicability; description of the oil...

  20. The latest developments and outlook for hydrogen liquefaction technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ohlig, K.; Decker, L.

    2014-01-29

    Liquefied hydrogen is presently used mainly for space applications and the semiconductor industry. While clean energy applications, e.g., the automotive sector, currently contribute only a small share of this demand, their demand may see a significant boost in the coming years, with the need for large-scale liquefaction plants far exceeding current plant sizes. Hydrogen liquefaction for small-scale plants with a maximum capacity of 3 tons per day (tpd) is accomplished with a Brayton refrigeration cycle using helium as refrigerant. This technology is characterized by low investment costs but lower process efficiency and hence higher operating costs. For larger plants, a hydrogen Claude cycle is used, characterized by higher investment but lower operating costs. However, liquefaction plants meeting the potentially high demand in the clean energy sector will need further optimization with regard to energy efficiency and hence operating costs. The present paper gives an overview of the currently applied technologies, including their thermodynamic and technical background. Areas of improvement are identified to derive process concepts for future large-scale hydrogen liquefaction plants meeting the needs of clean energy applications with optimized energy efficiency and hence minimized operating costs. Compared to studies in this field, this paper focuses on the application of new technology and innovative concepts which are either readily available or will require short qualification procedures. They will hence allow implementation in plants in the near future.
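    The efficiency gap the abstract discusses is usually quantified as second-law (exergy) efficiency: the thermodynamic minimum work of liquefaction divided by the specific energy a real plant actually consumes. The figures below are representative assumptions for illustration only, not values from the paper; a commonly cited ideal work for hydrogen, including ortho-para conversion, is roughly 3.9 kWh per kg of liquid.

    ```python
    # All numbers are illustrative assumptions, not data from the paper.
    IDEAL_WORK_KWH_PER_KG = 3.9  # approx. thermodynamic minimum, incl. ortho-para conversion

    assumed_specific_energy = {
        "small helium-Brayton plant": 13.0,   # kWh per kg LH2, assumed
        "large hydrogen-Claude plant": 10.0,  # kWh per kg LH2, assumed
    }

    for plant, w in assumed_specific_energy.items():
        # Second-law efficiency = ideal work / actual work
        eff = IDEAL_WORK_KWH_PER_KG / w
        print(f"{plant}: {w:.1f} kWh/kg -> exergy efficiency ~ {eff:.0%}")
    ```

    Under these assumed figures both cycles recover well under half of the ideal work, which is why the paper's focus on process optimization for large-scale plants matters for clean energy economics.
    
    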
