Sample records for efficient importance sampling

  1. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions.

    PubMed

    Tao, Guohua; Miller, William H

    2011-07-14

    An efficient time-dependent importance sampling method is developed for the Monte Carlo calculation of time correlation functions via the initial value representation (IVR) of semiclassical (SC) theory. A prefactor-free time-dependent sampling function weights the importance of a trajectory based on the magnitude of its contribution to the time correlation function, and global trial moves are used to facilitate efficient sampling of the phase space of initial conditions. The method can be applied generally to sample rare events efficiently while avoiding becoming trapped in a local region of phase space. Results presented in the paper for two system-bath models demonstrate the efficiency of this new importance sampling method for full SC-IVR calculations.
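
    The weighting principle described here can be illustrated with a minimal self-normalized importance sampling estimator in Python; the Gaussian target, shifted proposal, and observable below are toy assumptions for illustration, not the authors' SC-IVR implementation:

        # Self-normalized importance sampling sketch (toy target and observable).
        # Samples are drawn from a proposal q that concentrates where the
        # observable f contributes most, then reweighted by p/q; working with
        # log densities means normalization constants are never needed.
        import numpy as np

        rng = np.random.default_rng(0)

        def log_p(x):                      # target: standard normal (toy)
            return -0.5 * x**2

        def log_q(x, mu=2.0):              # proposal shifted to the important region
            return -0.5 * (x - mu)**2

        def f(x):                          # observable, peaked near x = 2
            return np.exp(-(x - 2.0)**2)

        x = rng.normal(2.0, 1.0, size=100_000)      # draws from q
        w = np.exp(log_p(x) - log_q(x))             # importance weights p/q
        print(np.sum(w * f(x)) / np.sum(w))         # self-normalized estimate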

  2. Coalescent: an open-science framework for importance sampling in coalescent theory.

    PubMed

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve the statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks for importance sampling, researchers often struggle to translate new sampling schemes computationally or to benchmark them against different schemes in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; license: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time, following the infinite-sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only effective sample size. Here, we evaluate proposals in the coalescent literature and discover that the order of efficiency among the three importance sampling schemes changes when one considers running time as well as effective sample size. We also describe a computational technique called "just-in-time delegation" that can improve the trade-off between running time and precision by constructing improved importance sampling schemes from existing ones. Thus, our systems approach is a potential solution to the "2⁸ programs problem" highlighted by Felsenstein, because it provides the flexibility to include or exclude various features of similar coalescent models or importance sampling schemes.
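
    Since this record ranks proposals by effective sample size (ESS) together with running time, a short sketch of the standard ESS diagnostic may help; the per-second adjustment shown is an assumption of this illustration, not the paper's exact metric:

        # Effective sample size of a set of importance weights:
        # ESS = (sum w)^2 / sum w^2, the usual efficiency diagnostic.
        import numpy as np

        def effective_sample_size(weights):
            w = np.asarray(weights, dtype=float)
            return w.sum() ** 2 / np.sum(w ** 2)

        # Ranking proposals by ESS alone can mislead, as the paper argues;
        # dividing by running time gives a cost-aware comparison.
        def ess_per_second(weights, runtime_seconds):
            return effective_sample_size(weights) / runtime_seconds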

  3. State-dependent biasing method for importance sampling in the weighted stochastic simulation algorithm.

    PubMed

    Roh, Min K; Gillespie, Dan T; Petzold, Linda R

    2010-11-07

    The weighted stochastic simulation algorithm (wSSA) was developed by Kuwahara and Mura [J. Chem. Phys. 129, 165101 (2008)] to efficiently estimate the probabilities of rare events in discrete stochastic systems. The wSSA uses importance sampling to enhance the statistical accuracy in the estimation of the probability of the rare event. The original algorithm biases the reaction selection step with a fixed importance sampling parameter. In this paper, we introduce a novel method where the biasing parameter is state-dependent. The new method features improved accuracy, efficiency, and robustness.
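
    A minimal sketch of the weighted-SSA bookkeeping this record builds on, with the state-dependent bias supplied as a function; the names and structure are illustrative assumptions, not the authors' code:

        # One step of a weighted SSA: the reaction is chosen from biased
        # propensities, and the trajectory weight is corrected by the
        # likelihood ratio so the rare-event estimator stays unbiased.
        import numpy as np

        rng = np.random.default_rng(1)

        def wssa_step(state, propensities, bias_factors, weight):
            a = propensities(state)              # unbiased propensities a_j(x)
            b = bias_factors(state) * a          # state-dependent biasing
            a0, b0 = a.sum(), b.sum()
            tau = rng.exponential(1.0 / a0)      # firing time from the unbiased total
            j = rng.choice(len(a), p=b / b0)     # biased reaction selection
            weight *= (a[j] / a0) / (b[j] / b0)  # importance-sampling correction
            return j, tau, weight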

  4. An efficient reliability algorithm for locating design point using the combination of importance sampling concepts and response surface method

    NASA Astrophysics Data System (ADS)

    Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin

    2017-06-01

    Monte Carlo simulation (MCS) is a useful tool for computing the probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. The response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm combining importance sampling, as a class of MCS, with RSM is proposed. In the proposed algorithm, the analysis starts with importance sampling concepts, using a proposed two-step rule for updating the design point. This phase finishes after only a small number of samples have been generated. RSM then takes over using Bucher's experimental design, with the last design point as the center point and a proposed effective length as the radius of Bucher's approach. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the proposed rules are shown.
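
    The importance sampling half of such algorithms can be illustrated with a standard design-point-centred estimator of the failure probability; this is a sketch under the usual standard-normal-space assumption and does not reproduce the paper's two-step updating rule or the RSM stage:

        # Importance sampling about a design point x* for P_f = P[g(X) <= 0],
        # with X standard normal: sample from a density centred at x* and
        # reweight each sample by f/h.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        def failure_probability(g, x_star, n=100_000):
            dim = len(x_star)
            f = stats.multivariate_normal(mean=np.zeros(dim))  # original density
            h = stats.multivariate_normal(mean=x_star)         # sampling density
            x = h.rvs(size=n, random_state=rng)
            w = f.pdf(x) / h.pdf(x)                            # importance weights
            return np.mean((g(x) <= 0) * w)

        # Toy limit state g(x) = 3 - x_1 with design point at (3, 0)
        print(failure_probability(lambda x: 3.0 - x[:, 0], np.array([3.0, 0.0])))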

  5. Comparison of Relative Bias, Precision, and Efficiency of Sampling Methods for Natural Enemies of Soybean Aphid (Hemiptera: Aphididae).

    PubMed

    Bannerman, J A; Costamagna, A C; McCornack, B P; Ragsdale, D W

    2015-06-01

    Generalist natural enemies play an important role in controlling soybean aphid, Aphis glycines (Hemiptera: Aphididae), in North America. Several sampling methods are used to monitor natural enemy populations in soybean, but there has been little work investigating their relative bias, precision, and efficiency. We compare five sampling methods (quadrats, whole-plant counts, sweep-netting, walking transects, and yellow sticky cards) to determine the most practical methods for sampling the three most prominent species: Harmonia axyridis (Pallas), Coccinella septempunctata L. (Coleoptera: Coccinellidae), and Orius insidiosus (Say) (Hemiptera: Anthocoridae). We show an important time-by-sampling-method interaction, indicated by diverging community similarities within and between sampling methods as the growing season progressed. Similarly, correlations between sampling methods for the three most abundant species over multiple time periods indicated differences in relative bias between sampling methods and suggest that bias is not consistent throughout the growing season, particularly for sticky cards and whole-plant samples. Furthermore, we show that sticky cards produce strongly biased capture rates relative to the other four sampling methods. Precision and efficiency differed between sampling methods: sticky cards produced the most precise (but highly biased) results for adult natural enemies, while walking transects and whole-plant counts were the most efficient methods for detecting coccinellids and O. insidiosus, respectively. Based on bias, precision, and efficiency considerations, the most practical sampling methods for monitoring in soybean include walking transects for coccinellid detection and whole-plant counts for detection of small predators like O. insidiosus. Sweep-netting and quadrat samples are also useful for some applications when efficiency is not paramount.

  6. Convergence and Efficiency of Adaptive Importance Sampling Techniques with Partial Biasing

    NASA Astrophysics Data System (ADS)

    Fort, G.; Jourdain, B.; Lelièvre, T.; Stoltz, G.

    2018-04-01

    We propose a new Monte Carlo method to efficiently sample a multimodal distribution (known up to a normalization constant). We consider a generalization of the discrete-time Self Healing Umbrella Sampling method, which can also be seen as a generalization of well-tempered metadynamics. The dynamics is based on an adaptive importance sampling technique. The importance function relies on the weights (namely the relative probabilities) of disjoint sets which form a partition of the space. These weights are unknown but are learnt on the fly, yielding an adaptive algorithm. In the context of computational statistical physics, the logarithm of these weights is, up to an additive constant, the free energy, and the discrete-valued function defining the partition is called the collective variable. The algorithm falls into the general class of Wang-Landau type methods, and generalizes the original Self Healing Umbrella Sampling method in two ways: (i) the updating strategy leads to a larger penalization strength of already visited sets, in order to escape more quickly from metastable states, and (ii) the target distribution is biased using only a fraction of the free energy, in order to increase the effective sample size and reduce the variance of importance sampling estimators. We prove the convergence of the algorithm and numerically analyze its efficiency on a toy example.
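
    A toy sketch of the two ingredients named in this abstract, penalization of visited sets and partial biasing, under simplified assumptions (a fixed penalty step and a scalar biasing fraction; the paper's update schedule is more refined):

        # Adaptive biasing over a partition: visited bins accumulate a penalty
        # (a running estimate of their log relative probability), and the
        # sampler's target is tilted by only a fraction of it.
        import numpy as np

        n_bins = 10
        log_weight_est = np.zeros(n_bins)    # log relative probabilities, learnt on the fly
        gamma = 0.1                          # penalization strength
        fraction = 0.5                       # partial biasing fraction

        def update_after_visit(visited_bin):
            # larger penalization of already visited sets speeds escape
            # from metastable states
            log_weight_est[visited_bin] += gamma

        def biased_log_density(log_pi_x, bin_of_x):
            # subtracting only a fraction of the estimated log weight keeps
            # importance weights better behaved (higher effective sample size)
            return log_pi_x - fraction * log_weight_est[bin_of_x]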

  7. Extended Importance Sampling for Reliability Analysis under Evidence Theory

    NASA Astrophysics Data System (ADS)

    Yuan, X. K.; Chen, B.; Zhang, B. Q.

    2018-05-01

    In early engineering practice, the lack of data and information makes uncertainty difficult to deal with. However, evidence theory has been proposed as an alternative to traditional probability theory for handling uncertainty with limited information. In this contribution, a simulation-based approach called 'extended importance sampling' is proposed, based on evidence theory, to handle problems with epistemic uncertainty. The proposed approach stems from traditional importance sampling for reliability analysis under probability theory. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, so that an 'equivalent' reliability problem under probability theory is obtained. Samples of these variables are then generated by importance sampling, and from these samples the plausibility and belief (upper and lower bounds of probability) can be estimated. The approach is more efficient than direct Monte Carlo simulation. Numerical and engineering examples are given to illustrate the efficiency and feasibility of the proposed approach.

  8. SPATIALLY-BALANCED SAMPLING OF NATURAL RESOURCES IN THE PRESENCE OF FRAME IMPERFECTIONS

    EPA Science Inventory

    The spatial distribution of a natural resource is an important consideration in designing an efficient survey or monitoring program for the resource. Generally, samples that are more or less evenly dispersed over the extent of the resource will be more efficient than simple rando...

  9. Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Guyer, T.; Stringfellow, G. B.

    1982-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important trends were noticed between the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements give statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.

  10. Optimal auxiliary-covariate-based two-phase sampling design for semiparametric efficient estimation of a mean or mean difference, with application to clinical trials.

    PubMed

    Gilbert, Peter B; Yu, Xuesong; Rotnitzky, Andrea

    2014-03-15

    To address the objective in a clinical trial of estimating the mean or mean difference of an expensive endpoint Y, one approach employs a two-phase sampling design, wherein inexpensive auxiliary variables W predictive of Y are measured in everyone, Y is measured in a random sample, and the semiparametric efficient estimator is applied. This approach is made efficient by specifying the phase-two selection probabilities as optimal functions of the auxiliary variables and measurement costs. While this approach is familiar to survey samplers, it apparently has seldom been used in clinical trials, and several novel results practicable for clinical trials are developed. We perform simulations to identify settings where the optimal approach significantly improves efficiency compared to approaches in current practice. We provide proofs and R code. The optimality results are developed to design an HIV vaccine trial, with the objective of comparing the mean 'importance-weighted' breadth (Y) of the T-cell response between randomized vaccine groups. The trial collects an auxiliary response (W) highly predictive of Y and measures Y in the optimal subset. We show that the optimal design-estimation approach can confer anywhere from no efficiency gain to a large one (up to 24% in the examples) compared to the approach with the same efficient estimator but simple random sampling, where greater variability in the cost-standardized conditional variance of Y given W yields greater efficiency gains. Accurate estimation of E[Y | W] is important for realizing the efficiency gain, which is aided by an ample phase-two sample and by using a robust fitting method.
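
    The shape of such optimal phase-two selection probabilities is Neyman-like: proportional to the square root of the conditional variance of Y given W over the measurement cost, capped at one and scaled to an expected budget. A sketch under exactly those assumptions (not the paper's R code):

        # Optimal-shape phase-two sampling probabilities:
        # pi(w) = min(c * sqrt(Var(Y|W=w) / cost(w)), 1), with c chosen by
        # bisection so that the expected cost matches the budget.
        import numpy as np

        def phase_two_probabilities(cond_var, cost, expected_budget):
            cond_var, cost = np.asarray(cond_var), np.asarray(cost)
            shape = np.sqrt(cond_var / cost)
            if expected_budget >= cost.sum():           # budget covers everyone
                return np.ones_like(shape)
            lo, hi = 0.0, 1.0 / shape[shape > 0].min()  # at hi, every pi hits the cap
            for _ in range(60):                         # bisection on the scale c
                c = 0.5 * (lo + hi)
                pi = np.minimum(c * shape, 1.0)
                if (pi * cost).sum() > expected_budget:
                    hi = c
                else:
                    lo = c
            return np.minimum(lo * shape, 1.0)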

  11. Optimal Auxiliary-Covariate Based Two-Phase Sampling Design for Semiparametric Efficient Estimation of a Mean or Mean Difference, with Application to Clinical Trials

    PubMed Central

    Gilbert, Peter B.; Yu, Xuesong; Rotnitzky, Andrea

    2014-01-01

    To address the objective in a clinical trial of estimating the mean or mean difference of an expensive endpoint Y, one approach employs a two-phase sampling design, wherein inexpensive auxiliary variables W predictive of Y are measured in everyone, Y is measured in a random sample, and the semi-parametric efficient estimator is applied. This approach is made efficient by specifying the phase-two selection probabilities as optimal functions of the auxiliary variables and measurement costs. While this approach is familiar to survey samplers, it apparently has seldom been used in clinical trials, and several novel results practicable for clinical trials are developed. Simulations are performed to identify settings where the optimal approach significantly improves efficiency compared to approaches in current practice. Proofs and R code are provided. The optimality results are developed to design an HIV vaccine trial, with the objective of comparing the mean "importance-weighted" breadth (Y) of the T cell response between randomized vaccine groups. The trial collects an auxiliary response (W) highly predictive of Y, and measures Y in the optimal subset. We show that the optimal design-estimation approach can confer anywhere from no efficiency gain to a large one (up to 24% in the examples) compared to the approach with the same efficient estimator but simple random sampling, where greater variability in the cost-standardized conditional variance of Y given W yields greater efficiency gains. Accurate estimation of E[Y | W] is important for realizing the efficiency gain, which is aided by an ample phase-two sample and by using a robust fitting method. PMID:24123289

  12. General statistical considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eberhardt, L L; Gilbert, R O

    From NAEG plutonium environmental studies program meeting; Las Vegas, Nevada, USA (2 Oct 1973). The high sampling variability encountered in environmental plutonium studies along with high analytical costs makes it very important that efficient soil sampling plans be used. However, efficient sampling depends on explicit and simple statements of the objectives of the study. When there are multiple objectives it may be difficult to devise a wholly suitable sampling scheme. Sampling for long-term changes in plutonium concentration in soils may also be complex and expensive. Further attention to problems associated with compositing samples is recommended, as is the consistent use of random sampling as a basic technique. (auth)

  13. Quantitative Analysis of Defects in Silicon. [to predict energy conversion efficiency of silicon samples for solar cells

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Smith, J. M.; Qidwai, H. A.; Bruce, T.

    1979-01-01

    The evaluation and prediction of the conversion efficiency for a variety of silicon samples with differences in structural defects, such as grain boundaries, twin boundaries, precipitate particles, dislocations, etc. are discussed. Quantitative characterization of these structural defects, which were revealed by etching the surface of silicon samples, is performed by using an image analyzer. Due to different crystal growth and fabrication techniques the various types of silicon contain a variety of trace impurity elements and structural defects. The two most important criteria in evaluating the various silicon types for solar cell applications are cost and conversion efficiency.

  14. Implications of sampling design and sample size for national carbon accounting systems.

    PubMed

    Köhl, Michael; Lister, Andrew; Scott, Charles T; Baldauf, Thomas; Plugge, Daniel

    2011-11-08

    Countries willing to adopt a REDD regime need to establish a national Measurement, Reporting and Verification (MRV) system that provides information on forest carbon stocks and carbon stock changes. Due to the extensive areas covered by forests, this information is generally obtained by sample-based surveys. Most operational sampling approaches utilize a combination of earth-observation data and in-situ field assessments as data sources. We compared the cost-efficiency of four different sampling design alternatives (simple random sampling, regression estimators, stratified sampling, and two-phase sampling with regression estimators) that have been proposed in the scope of REDD. Three of the design alternatives provide for a combination of in-situ and earth-observation data. Under different settings of remote sensing coverage, cost per field plot, cost of remote sensing imagery, correlation between attributes quantified in remote sensing and field data, and population variability, the percent standard error over total survey cost was calculated. The cost-efficiency of forest carbon stock assessments is driven by the sampling design chosen. Our results indicate that the cost of remote sensing imagery is decisive for the cost-efficiency of a sampling design. The variability of the sample population impairs cost-efficiency, but does not reverse the pattern of cost-efficiency of the individual design alternatives. Our results clearly indicate that it is important to consider cost-efficiency in the development of forest carbon stock assessments and the selection of remote sensing techniques. The development of MRV systems for REDD needs to be based on a sound optimization process that compares different data sources and sampling designs with respect to their cost-efficiency. This helps to reduce the uncertainties related to the quantification of carbon stocks and to increase the financial benefits from adopting a REDD regime.
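
    As a back-of-envelope illustration of the trade-off this record describes, one can compare simple random sampling against a regression estimator at a fixed budget; every number below is a hypothetical placeholder, and the regression-estimator variance uses the standard large-sample factor (1 - rho^2):

        # Percent standard error per fixed budget for two design alternatives.
        import numpy as np

        budget = 500_000.0        # total survey budget (hypothetical)
        cost_plot = 500.0         # cost per in-situ field plot (hypothetical)
        cost_imagery = 100_000.0  # fixed cost of remote sensing imagery (hypothetical)
        S, mean = 60.0, 120.0     # population std. dev. and mean of carbon stock
        rho = 0.8                 # field / remote-sensing attribute correlation

        # Simple random sampling: the whole budget buys field plots
        n_srs = budget / cost_plot
        se_srs = 100 * S / (np.sqrt(n_srs) * mean)

        # Regression estimator: imagery is paid first, fewer plots, but the
        # variance shrinks by the factor (1 - rho^2)
        n_reg = (budget - cost_imagery) / cost_plot
        se_reg = 100 * S * np.sqrt(1 - rho**2) / (np.sqrt(n_reg) * mean)
        print(f"SRS: {se_srs:.2f}%  regression estimator: {se_reg:.2f}%")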

  15. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
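
    A much-simplified sketch of the adaptive idea: re-centre the sampling density on failure points found so far, so the sampling domain migrates toward the failure domain. The paper's incremental domain construction and sensitivity coefficients are not reproduced here:

        # Adaptive importance sampling: repeatedly re-centre the proposal on
        # the mean of observed failure points, then estimate P_f with weights
        # from the final stage.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        def adaptive_is(g, dim, n_per_stage=5_000, stages=4):
            center = np.zeros(dim)
            for _ in range(stages):
                h = stats.multivariate_normal(mean=center)
                x = h.rvs(size=n_per_stage, random_state=rng)
                fail = g(x) <= 0
                if fail.any():
                    center = x[fail].mean(axis=0)   # move toward failure region
            f = stats.multivariate_normal(mean=np.zeros(dim))
            w = f.pdf(x) / h.pdf(x)                 # weights for the last stage
            return np.mean(fail * w)

        print(adaptive_is(lambda x: 4.0 - x[:, 0] - 0.5 * x[:, 1], dim=2))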

  16. Field efficiency and bias of snag inventory methods

    Treesearch

    Robert S. Kenning; Mark J. Ducey; John C. Brissette; Jeffery H. Gove

    2005-01-01

    Snags and cavity trees are important components of forests, but can be difficult to inventory precisely and are not always included in inventories because of limited resources. We tested the application of N-tree distance sampling as a time-saving snag sampling method and compared N-tree distance sampling to fixed-area sampling and modified horizontal line sampling in...

  17. Vortex focusing of ions produced in corona discharge.

    PubMed

    Kolomiets, Yuri N; Pervukhin, Viktor V

    2013-06-15

    Completeness of ion transportation into the analytical path defines the efficiency of ionization analysis techniques. This is of particular importance for atmospheric pressure ionization sources, such as corona discharge, electrospray, and ionization with radioactive (³H, ⁶³Ni) isotopes, that produce a nonuniform spatial distribution of sample ions. The available methods of sample ion focusing are either efficient only at reduced pressure (~1 Torr) or feature high sample losses. This paper deals with experimental research into atmospheric pressure focusing of unipolar (positive) ions using a highly swirled air stream with a well-defined vortex core. The effects of the electrical fields from the corona needle and the mass spectrometer inlet capillary on collection efficiency are considered. We used a corona discharge to produce an ionized unipolar sample. It is shown experimentally that, with an electrical field barrier, efficient transportation and focusing of an ionized sample are possible only when a metal plate restricting the stream and provided with a grid-covered opening is used. This gives a five-fold increase in transportation efficiency. It is shown that the electric field barrier in the vortex sampling region reduces the efficiency of remote ionized sample transportation by a factor of two. The observed difference in the efficiency of light ion focusing may be explained by their high mobility and the significant effect of the electric field barrier upon them. Based on the experimental data, it is possible to conclude that the presence of the field barrier narrows the region of vortex sample ion focusing considerably (by more than a factor of 1.5).

  18. Simple street tree sampling

    Treesearch

    David J. Nowak; Jeffrey T. Walton; James Baldwin; Jerry Bond

    2015-01-01

    Information on street trees is critical for management of this important resource. Sampling of street tree populations provides an efficient means to obtain street tree population information. Long-term repeat measures of street tree samples supply additional information on street tree changes and can be used to report damages from catastrophic events. Analyses of...

  19. An Efficient MCMC Algorithm to Sample Binary Matrices with Fixed Marginals

    ERIC Educational Resources Information Center

    Verhelst, Norman D.

    2008-01-01

    Uniform sampling of binary matrices with fixed margins is known as a difficult problem. Two classes of algorithms to sample from a distribution not too different from the uniform are studied in the literature: importance sampling and Markov chain Monte Carlo (MCMC). Existing MCMC algorithms converge slowly, require a long burn-in period and yield…

  20. Multi-species attributes as the condition for adaptive sampling of rare species using two-stage sequential sampling with an auxiliary variable

    USGS Publications Warehouse

    Panahbehagh, B.; Smith, D.R.; Salehi, M.M.; Hornbach, D.J.; Brown, D.J.; Chan, F.; Marinova, D.; Anderssen, R.S.

    2011-01-01

    Assessing populations of rare species is challenging because of the large effort required to locate patches of occupied habitat and achieve precise estimates of density and abundance. The presence of a rare species has been shown to be correlated with the presence or abundance of more common species. Thus, ecological community richness or abundance can be used to inform sampling of rare species. Adaptive sampling designs have been developed specifically for rare and clustered populations and have been applied to a wide range of rare species. However, adaptive sampling can be logistically challenging, in part because variation in final sample size introduces uncertainty in survey planning. Two-stage sequential sampling (TSS), a recently developed design, allows for adaptive sampling but avoids edge units and has an upper bound on final sample size. In this paper we present an extension of two-stage sequential sampling that incorporates an auxiliary variable (TSSAV), such as community attributes, as the condition for adaptive sampling. We develop a set of simulations to approximate sampling of endangered freshwater mussels to evaluate the performance of the TSSAV design. The performance measures that we are interested in are efficiency and the probability of sampling a unit occupied by the rare species. Efficiency measures the precision of the population estimate from the TSSAV design relative to a standard design, such as simple random sampling (SRS). The simulations indicate that the density and distribution of the auxiliary population are the most important determinants of the performance of the TSSAV design. Of the design factors, such as sample size, the fraction of the primary units sampled was most important. For the best scenarios, the odds of sampling the rare species were approximately 1.5 times higher for TSSAV compared to SRS, and efficiency was as high as 2 (i.e., the variance from TSSAV was half that of SRS). We have found that design performance, especially for adaptive designs, is often case-specific. The efficiency of adaptive designs is especially sensitive to spatial distribution. We recommend simulations tailored to the application of interest as a highly useful way to evaluate designs in preparation for sampling rare and clustered populations.

  1. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions. II. A simplified implementation.

    PubMed

    Tao, Guohua; Miller, William H

    2012-09-28

    An efficient time-dependent (TD) Monte Carlo (MC) importance sampling method has recently been developed [G. Tao and W. H. Miller, J. Chem. Phys. 135, 024104 (2011)] for the evaluation of time correlation functions using the semiclassical (SC) initial value representation (IVR) methodology. In this TD-SC-IVR method, the MC sampling uses information from both time-evolved phase points as well as their initial values, and only the "important" trajectories are sampled frequently. Even though the TD-SC-IVR was shown in some benchmark examples to be much more efficient than the traditional time-independent sampling method (which uses only initial conditions), the calculation of the SC prefactor, which is computationally expensive, especially for large systems, is still required for accepted trajectories. In the present work, we present an approximate implementation of the TD-SC-IVR method that is completely prefactor-free; it gives the time correlation function as a classical-like magnitude function multiplied by a phase function. Application of this approach to flux-flux correlation functions (which yield reaction rate constants) for the benchmark H + H₂ system shows very good agreement with exact quantum results. Limitations of the approximate approach are also discussed.

  2. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    NASA Astrophysics Data System (ADS)

    Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl

    2018-06-01

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
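
    For orientation, here is a sketch of the classical double-loop Monte Carlo estimator of expected information gain that this paper accelerates, on a toy linear-Gaussian model; the model, noise level, and prior are placeholder assumptions. The log of the small inner average is where underflow arises:

        # Double-loop MC estimator of expected information gain (EIG):
        # EIG ≈ (1/N) Σ_n [ log p(y_n | θ_n) − log (1/M) Σ_m p(y_n | θ_m) ]
        import numpy as np

        rng = np.random.default_rng(4)
        SIGMA = 0.1                          # observation noise (assumed)

        def log_like(y, theta, design):      # toy model: y = design * θ + ε
            return (-0.5 * ((y - design * theta) / SIGMA) ** 2
                    - np.log(SIGMA * np.sqrt(2 * np.pi)))

        def eig_dlmc(design, n_outer=500, n_inner=500):
            theta_out = rng.normal(size=n_outer)               # prior draws
            y = design * theta_out + SIGMA * rng.normal(size=n_outer)
            theta_in = rng.normal(size=n_inner)                # inner prior draws
            total = 0.0
            for y_n, t_n in zip(y, theta_out):
                evidence = np.mean(np.exp(log_like(y_n, theta_in, design)))
                total += log_like(y_n, t_n, design) - np.log(evidence)
            return total / n_outer

        print(eig_dlmc(design=2.0))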

  3. Assessing overall, technical, and scale efficiency among home health care agencies.

    PubMed

    Valdmanis, Vivian G; Rosko, Michael D; Leleu, Hervé; Mukamel, Dana B

    2017-06-01

    While home health care agencies (HHAs) play a vital role in the production of health, little research has been performed gauging their efficiency. Employing a robust approach to data envelopment analysis (DEA), we assessed overall, technical, and scale efficiency in a nationwide sample of HHAs. After deriving the three efficiency measures, we regressed these scores on a variety of environmental factors. We found that HHAs, on average, could proportionally reduce inputs by 28% (overall efficiency), 23% (technical efficiency) and 6% (scale efficiency). For-profit ownership was positively associated with improvements in overall efficiency and technical efficiency, and chain ownership was positively associated with overall efficiency. There were also state-by-state variations in all the efficiency measures. As home health becomes an increasingly important player in the health care system, and its share of national health expenditures increases, it has become important to understand the cost structure of the industry and the potential for efficiencies. Therefore, further research is recommended as this sector continues to grow.

  4. SPATIALLY-BALANCED SAMPLING OF NATURAL RESOURCES

    EPA Science Inventory

    The spatial distribution of a natural resource is an important consideration in designing an efficient survey or monitoring program for the resource. Generally, sample sites that are spatially-balanced, that is, more or less evenly dispersed over the extent of the resource, will ...

  5. A practical modification of horizontal line sampling for snag and cavity tree inventory

    Treesearch

    M. J. Ducey; G. J. Jordan; J. H. Gove; H. T. Valentine

    2002-01-01

    Snags and cavity trees are important structural features in forests, but they are often sparsely distributed, making efficient inventories problematic. We present a straightforward modification of horizontal line sampling designed to facilitate inventory of these features while remaining compatible with commonly employed sampling methods for the living overstory. The...

  6. Mosquito Species (Diptera: Culicidae) Diversity from Ovitraps in a Mesoamerican Tropical Rainforest.

    PubMed

    Chaverri, Luis Guillermo; Dillenbeck, Claire; Lewis, Devon; Rivera, Cindy; Romero, Luis Mario; Chaves, Luis Fernando

    2018-05-04

    Mosquito sampling using efficient traps that can assess species diversity and/or the presence of dominant vectors is important for understanding the entomological risk of mosquito-borne disease transmission. Here, we present results from a survey of mosquito species sampled with ovitraps in a neotropical rainforest of Costa Rica. We found the method to be an efficient sampling tool. With a total sampling effort of 29 traps, we collected 157 fourth-instar larvae and three pupae belonging to eight mosquito taxonomic units (seven species plus individuals from a homogeneous taxonomic unit identified to the genus level). In our samples, we found two medically important species, Sabethes chloropterus (Humboldt) and Trichoprosopon digitatum (Rondani). The former is a proven vector of yellow fever in sylvatic environments, and the latter has been found infected with several arboviruses. We also found that mosquito species abundance and diversity increased with canopy cover and in environments where leaf litter dominated the ground cover. Finally, our results suggest that ovitraps have great potential for systematic sampling in longitudinal and cross-sectional ecological "semi-field" studies in neotropical settings.

  7. Differences in Movement Pattern and Detectability between Males and Females Influence How Common Sampling Methods Estimate Sex Ratio.

    PubMed

    Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco

    2016-01-01

    Sampling biodiversity is an essential step for conservation, and understanding the efficiency of sampling methods allows us to estimate the quality of our biodiversity data. Sex ratio is an important population characteristic, but until now, no study has evaluated how efficient the sampling methods commonly used in biodiversity surveys are at estimating the sex ratio of populations. We used a virtual ecologist approach to investigate whether active and passive capture methods are able to accurately sample a population's sex ratio and whether differences in movement pattern and detectability between males and females produce biased estimates of sex ratios when using these methods. Our simulation allowed the recognition of individuals, similar to mark-recapture studies. We found that differences in both movement patterns and detectability between males and females produce biased estimates of sex ratios. However, increasing the sampling effort or the number of sampling days improves the ability of passive or active capture methods to properly sample the sex ratio. Thus, prior knowledge regarding movement patterns and detectability for species is important information to guide field studies aiming to understand sex-ratio-related patterns.

  8. Differences in Movement Pattern and Detectability between Males and Females Influence How Common Sampling Methods Estimate Sex Ratio

    PubMed Central

    Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco

    2016-01-01

    Sampling biodiversity is an essential step for conservation, and understanding the efficiency of sampling methods allows us to estimate the quality of our biodiversity data. Sex ratio is an important population characteristic, but until now, no study has evaluated how efficient the sampling methods commonly used in biodiversity surveys are at estimating the sex ratio of populations. We used a virtual ecologist approach to investigate whether active and passive capture methods are able to accurately sample a population's sex ratio and whether differences in movement pattern and detectability between males and females produce biased estimates of sex ratios when using these methods. Our simulation allowed the recognition of individuals, similar to mark-recapture studies. We found that differences in both movement patterns and detectability between males and females produce biased estimates of sex ratios. However, increasing the sampling effort or the number of sampling days improves the ability of passive or active capture methods to properly sample the sex ratio. Thus, prior knowledge regarding movement patterns and detectability for species is important information to guide field studies aiming to understand sex-ratio-related patterns. PMID:27441554

  9. A comprehensive evaluation of various sensitivity analysis methods: A case study with a hydrological model

    DOE PAGES

    Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...

    2014-01-01

    Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect, and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, a minimum of 1050 samples is needed to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
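
    Of the methods compared, Morris One-At-a-Time is the simplest to sketch: elementary effects computed along random trajectories, summarized by the mean of their absolute values and their standard deviation. This is a generic illustration, not PSUADE's implementation:

        # Morris OAT screening: elementary effects along random trajectories.
        import numpy as np

        rng = np.random.default_rng(5)

        def morris_screening(model, dim, n_trajectories=20, delta=0.1):
            effects = np.zeros((n_trajectories, dim))
            for t in range(n_trajectories):
                x = rng.uniform(0.0, 1.0 - delta, size=dim)  # base point in [0,1]^d
                y = model(x)
                for i in rng.permutation(dim):               # one factor at a time
                    x_next = x.copy()
                    x_next[i] += delta
                    y_next = model(x_next)
                    effects[t, i] = (y_next - y) / delta     # elementary effect
                    x, y = x_next, y_next                    # walk the trajectory
            mu_star = np.abs(effects).mean(axis=0)   # importance ranking
            sigma = effects.std(axis=0)              # nonlinearity/interaction
            return mu_star, sigma

        mu_star, sigma = morris_screening(lambda x: x[0] + 10 * x[1] ** 2, dim=3)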

  10. VARIANCE ESTIMATION FOR SPATIALLY BALANCED SAMPLES OF ENVIRONMENTAL RESOURCES

    EPA Science Inventory

    The spatial distribution of a natural resource is an important consideration in designing an efficient survey or monitoring program for the resource. We review a unified strategy for designing probability samples of discrete, finite resource populations, such as lakes within som...

  11. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. Hydrogeological properties are assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation via LU decomposition to form LULHS. Both conditional and unconditional LULHS simulations were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
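
    The combination reads as: stratify the marginals with LHS, then impose the spatial correlation with a lower-triangular factor of the covariance. A compact sketch assuming an exponential covariance model (the conditional variant is not shown):

        # LULHS sketch: Latin hypercube marginals, correlated through the
        # Cholesky (LU) factor of an exponential spatial covariance.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)

        def lulhs_field(coords, n_realizations, corr_length=1.0):
            n = len(coords)
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            L = np.linalg.cholesky(np.exp(-d / corr_length) + 1e-10 * np.eye(n))
            # one LHS stratum per realization at every grid point
            strata = np.tile(np.arange(n_realizations), (n, 1))
            u = rng.permuted(strata, axis=1) + rng.uniform(size=(n, n_realizations))
            u /= n_realizations
            z = stats.norm.ppf(u)            # stratified standard-normal marginals
            return (L @ z).T                 # (n_realizations, n) correlated fields

        grid = np.array([[i, j] for i in range(10) for j in range(10)], dtype=float)
        fields = lulhs_field(grid, n_realizations=50)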

  12. Lipidic cubic phase injector is a viable crystal delivery system for time-resolved serial crystallography

    DOE PAGES

    Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett; ...

    2016-08-22

    Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. Furthermore, this study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX.

  13. Lipidic cubic phase injector is a viable crystal delivery system for time-resolved serial crystallography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett

    Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. Furthermore, this study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX.

  14. Lipidic cubic phase injector is a viable crystal delivery system for time-resolved serial crystallography

    PubMed Central

    Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett; Gati, Cornelius; Kimura, Tetsunari; Milne, Christopher; Milathianaki, Despina; Kubo, Minoru; Wu, Wenting; Conrad, Chelsie; Coe, Jesse; Bean, Richard; Zhao, Yun; Båth, Petra; Dods, Robert; Harimoorthy, Rajiv; Beyerlein, Kenneth R.; Rheinberger, Jan; James, Daniel; DePonte, Daniel; Li, Chufeng; Sala, Leonardo; Williams, Garth J.; Hunter, Mark S.; Koglin, Jason E.; Berntsen, Peter; Nango, Eriko; Iwata, So; Chapman, Henry N.; Fromme, Petra; Frank, Matthias; Abela, Rafael; Boutet, Sébastien; Barty, Anton; White, Thomas A.; Weierstall, Uwe; Spence, John; Neutze, Richard; Schertler, Gebhard; Standfuss, Jörg

    2016-01-01

    Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. This study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX. PMID:27545823

  15. Reduced Sampling Size with Nanopipette for Tapping-Mode Scanning Probe Electrospray Ionization Mass Spectrometry Imaging

    PubMed Central

    Kohigashi, Tsuyoshi; Otsuka, Yoichi; Shimazu, Ryo; Matsumoto, Takuya; Iwata, Futoshi; Kawasaki, Hideya; Arakawa, Ryuichi

    2016-01-01

    Mass spectrometry imaging (MSI) with ambient sampling and ionization can rapidly and easily capture the distribution of chemical components in a solid sample. Because the spatial resolution of MSI is limited by the size of the sampling area, reducing the sampling size is an important goal for high-resolution MSI. Here, we report the first use of a nanopipette for sampling and ionization by tapping-mode scanning probe electrospray ionization (t-SPESI). The spot size of the sampling area of a dye molecular film on a glass substrate was decreased to 6 μm on average by using a nanopipette. On the other hand, ionization efficiency increased with decreasing solvent flow rate. Our results indicate that a reduced sampling area is compatible with high ionization efficiency when using a nanopipette. MSI of micropatterns of ink on glass and polymer substrates was also demonstrated. PMID:28101441

  16. A preliminary investigation of sleep quality in functional neurological disorders: Poor sleep appears common, and is associated with functional impairment.

    PubMed

    Graham, Christopher D; Kyle, Simon D

    2017-07-15

    Functional neurological disorders (FND) are disabling conditions for which there are few empirically supported treatments. Disturbed sleep appears to be part of the FND context; however, the clinical importance of sleep disturbance (extent, characteristics and impact) remains largely unknown. We described sleep quality in two samples and investigated the relationship between sleep and FND-related functional impairment. We included a sample recruited online via patient charities (N = 205) and a consecutive clinical sample (N = 20). Participants completed validated measures of sleep quality and sleep characteristics (e.g. total sleep time, sleep efficiency), mood, and FND-related functional impairment. Poor sleep was common in both samples (89% in the clinical range), characterised by low sleep efficiency (M = 65.40%) and low total sleep time (M = 6.05 h). In regression analysis, sleep quality was negatively associated with FND-related functional impairment, accounting for 16% of the variance and remaining significant after the introduction of mood variables. These preliminary analyses suggest that subjective sleep disturbance (low efficiency, short sleep) is common in FND. Sleep quality was negatively associated with the functional impairment attributed to FND, independent of depression. Therefore, sleep disturbance may be a clinically important feature of FND.

  17. Solid versus Liquid Particle Sampling Efficiency of Three Personal Aerosol Samplers when Facing the Wind

    PubMed Central

    Koehler, Kirsten A.; Anthony, T. Renee; Van Dyke, Michael

    2016-01-01

    The objective of this study was to examine the facing-the-wind sampling efficiency of three personal aerosol samplers as a function of particle phase (solid versus liquid). Samplers examined were the IOM, Button, and a prototype personal high-flow inhalable sampler head (PHISH). The prototype PHISH was designed to interface with the 37-mm closed-face cassette and provide an inhalable sample at 10 l min⁻¹ of flow. Increased flow rate increases the amount of mass collected during a typical work shift and helps to ensure that limits of detection are met, particularly for well-controlled but highly toxic species. Two PHISH prototypes were tested: one with a screened inlet and one with a single-pore open-face inlet. Personal aerosol samplers were tested on a bluff-body disc that was rotated along the facing-the-wind axis to reduce spatiotemporal variability associated with sampling supermicron aerosol in low-velocity wind tunnels. When compared to published data for facing-wind aspiration efficiency for a mouth-breathing mannequin, the IOM oversampled relative to mannequin facing-the-wind aspiration efficiency for all sizes and particle types (solid and liquid). The sampling efficiency of the Button sampler was closer to the mannequin facing-the-wind aspiration efficiency than the IOM for solid particles, but the screened inlet removed most liquid particles, resulting in a large underestimation compared to the mannequin facing-the-wind aspiration efficiency. The open-face PHISH results showed overestimation for solid particles and underestimation for liquid particles when compared to the mannequin facing-the-wind aspiration efficiency. Substantial (and statistically significant) differences in sampling efficiency were observed between liquid and solid particles, particularly for the Button and screened-PHISH, with a majority of aerosol mass depositing on the screened inlets of these samplers. Our results suggest that large droplets have low penetration efficiencies through screened inlets and that particle bounce, for solid particles, is an important determinant of aspiration and sampling efficiencies for samplers with screened inlets. PMID:21965462

  18. An efficient quantum algorithm for spectral estimation

    NASA Astrophysics Data System (ADS)

    Steffens, Adrian; Rebentrost, Patrick; Marvian, Iman; Eisert, Jens; Lloyd, Seth

    2017-03-01

    We develop an efficient quantum implementation of an important signal processing algorithm for line spectral estimation: the matrix pencil method, which determines the frequencies and damping factors of signals consisting of finite sums of exponentially damped sinusoids. Our algorithm provides a quantum speedup in a natural regime where the sampling rate is much higher than the number of sinusoid components. Along the way, we develop techniques that are expected to be useful for other quantum algorithms as well—consecutive phase estimations to efficiently make products of asymmetric low rank matrices classically accessible and an alternative method to efficiently exponentiate non-Hermitian matrices. Our algorithm features an efficient quantum-classical division of labor: the time-critical steps are implemented in quantum superposition, while an interjacent step, requiring much fewer parameters, can operate classically. We show that frequencies and damping factors can be obtained in time logarithmic in the number of sampling points, exponentially faster than known classical algorithms.
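
    For reference, the classical (non-quantum) matrix pencil method that the quantum speedup targets can be sketched in a few lines for a noiseless signal y[k] = Σ_j a_j z_j^k; the pencil parameter choice and noise handling are simplified assumptions here:

        # Classical matrix pencil method: the signal poles are eigenvalues of
        # a shifted pair of Hankel sub-matrices.
        import numpy as np

        def matrix_pencil(y, n_components, dt=1.0):
            N = len(y)
            L = N // 2                                    # pencil parameter
            Y = np.array([y[i:i + L + 1] for i in range(N - L)])
            Y1, Y2 = Y[:, :-1], Y[:, 1:]                  # shifted sub-pencils
            z = np.linalg.eigvals(np.linalg.pinv(Y1) @ Y2)
            z = z[np.argsort(-np.abs(z))][:n_components]  # keep dominant poles
            freqs = np.angle(z) / (2 * np.pi * dt)        # frequencies
            damping = np.log(np.abs(z)) / dt              # damping factors
            return freqs, damping

        t = np.arange(64)
        y = np.exp(-0.05 * t) * np.cos(2 * np.pi * 0.1 * t) + 0.5 * np.cos(2 * np.pi * 0.23 * t)
        print(matrix_pencil(y, n_components=4))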

  19. Efficiency and productivity measurement of rural township hospitals in China: a bootstrapping data envelopment analysis

    PubMed Central

    Cheng, Zhaohui; Cai, Miao; Tao, Hongbing; He, Zhifei; Lin, Xiaojun; Lin, Haifeng; Zuo, Yuling

    2016-01-01

    Objective: Township hospitals (THs) are important components of the three-tier rural healthcare system of China. However, the efficiency and productivity of THs have been questioned since the healthcare reform was implemented in 2009. The objective of this study is to analyse the efficiency and productivity changes in THs before and after the reform process. Setting and participants: A total of 48 sample THs were selected from the Xiaogan Prefecture in Hubei Province from 2008 to 2014. Outcome measures: First, bootstrapping data envelopment analysis (DEA) was performed to estimate the technical efficiency (TE), pure technical efficiency (PTE) and scale efficiency (SE) of the sample THs during the period. Second, the bootstrapping Malmquist productivity index was used to calculate the productivity changes over time. Results: The average TE, PTE and SE of the sample THs over the 7-year period were 0.5147, 0.6373 and 0.7080, respectively. The average TE and PTE increased from 2008 to 2012 but declined considerably after 2012. In general, the sample THs experienced a negative shift in productivity from 2008 to 2014. The negative change was 2.14%, which was attributed to a 23.89% decrease in technological changes (TC). The sample THs experienced a positive productivity shift from 2008 to 2012 but experienced deterioration from 2012 to 2014. Conclusions: There was considerable space for TE improvement in the sample THs since the average TE was relatively low. From 2008 to 2014, the sample THs experienced a decrease in productivity, and the adverse alteration in TC should be emphasised. In the context of healthcare reform, the factors that influence TE and productivity of THs are complex. Results suggest that numerous quantitative and qualitative studies are necessary to explore the reasons for the changes in TE and productivity. PMID:27836870

  20. Importance of the efficiency of double-stranded DNA formation in cDNA synthesis for the imprecision of microarray expression analysis.

    PubMed

    Thormar, Hans G; Gudmundsson, Bjarki; Eiriksdottir, Freyja; Kil, Siyoen; Gunnarsson, Gudmundur H; Magnusson, Magnus Karl; Hsu, Jason C; Jonsson, Jon J

    2013-04-01

    The causes of imprecision in microarray expression analysis are poorly understood, limiting the use of this technology in molecular diagnostics. Two-dimensional strandness-dependent electrophoresis (2D-SDE) separates nucleic acid molecules on the basis of length and strandness, i.e., double-stranded DNA (dsDNA), single-stranded DNA (ssDNA), and RNA·DNA hybrids. We used 2D-SDE to measure the efficiency of cDNA synthesis and its importance for the imprecision of an in vitro transcription-based microarray expression analysis. The relative amount of double-stranded cDNA formed in replicate experiments that used the same RNA sample template was highly variable, ranging between 0% and 72% of the total DNA. Microarray experiments showed an inverse relationship between the difference between sample pairs in probe variance and the relative amount of dsDNA. Approximately 15% of probes showed between-sample variation (P < 0.05) when the dsDNA percentage was between 12% and 35%. In contrast, only 3% of probes showed between-sample variation when the dsDNA percentage was 69% and 72%. Replication experiments of the 35% dsDNA and 72% dsDNA samples were used to separate sample variation from probe replication variation. The estimated SD of the sample-to-sample variation and of the probe replicates was lower in 72% dsDNA samples than in 35% dsDNA samples. Variation in the relative amount of double-stranded cDNA synthesized can be an important component of the imprecision in T7 RNA polymerase-based microarray expression analysis.

  1. Adaptive Sampling for Urban Air Quality through Participatory Sensing

    PubMed Central

    Zeng, Yuanyuan; Xiang, Kai

    2017-01-01

    Air pollution is one of the major problems of the modern world. The popularization and powerful functionality of smartphone applications enable people to participate in urban sensing and better understand the air quality around them. Data sampling is one of the most important factors affecting sensing performance. In this paper, we propose an Adaptive Sampling Scheme for Urban Air Quality (AS-air) through participatory sensing. First, we propose to find pattern rules of air quality from the historical data contributed by participants, using the Apriori algorithm. Based on these rules, we predict air quality online and use the prediction to accelerate the learning process that chooses and adapts the sampling parameter via Q-learning. The evaluation results show that AS-air provides an energy-efficient sampling strategy, which adapts to the varied outside air environment with good sampling efficiency. PMID:29099766
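
    The Q-learning step behind the adaptive sampling parameter can be sketched generically; the state/action discretization and the reward below are hypothetical simplifications, not AS-air's exact formulation:

        # Epsilon-greedy Q-learning for choosing a sampling interval.
        import numpy as np

        rng = np.random.default_rng(7)
        n_states, n_actions = 5, 3           # e.g. AQI trend bins x {5, 15, 30 min}
        Q = np.zeros((n_states, n_actions))
        alpha, gamma, eps = 0.1, 0.9, 0.1    # learning rate, discount, exploration

        def choose_action(state):
            if rng.random() < eps:           # explore
                return int(rng.integers(n_actions))
            return int(np.argmax(Q[state]))  # exploit

        def update(state, action, reward, next_state):
            # reward would trade sensing energy cost against prediction error
            td_target = reward + gamma * Q[next_state].max()
            Q[state, action] += alpha * (td_target - Q[state, action])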

  2. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    PubMed Central

    Cao, Youfang; Liang, Jie

    2013-01-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape. PMID:23862966
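    The look-ahead machinery of ABSIS builds on the weighted stochastic simulation idea the abstract mentions: bias the reaction-selection probabilities and carry a likelihood-ratio weight that keeps the estimator unbiased. The sketch below shows only that underlying mechanic for a birth-death process, with fixed biases standing in for the adaptive, short-path-derived biases of ABSIS.

    ```python
    import random

    def weighted_ssa_reach(x0, target, t_max, k_birth, k_death, bias_up, bias_down):
        """One weighted-SSA trajectory of a birth-death process (X -> X+1 at rate
        k_birth; X -> X-1 at rate k_death*X). Reaction selection is biased, and a
        likelihood-ratio weight corrects for the bias. Fixed biases here stand in
        for the adaptive, look-ahead biases that ABSIS derives from short paths."""
        x, t, w = x0, 0.0, 1.0
        while t < t_max:
            if x >= target:
                return w                       # rare event reached: weighted hit
            a = (k_birth, k_death * x)         # true propensities
            a0 = a[0] + a[1]
            if a0 == 0.0:
                break
            t += random.expovariate(a0)        # time advance uses the true rates
            p = (a[0] / a0, a[1] / a0)         # true selection probabilities
            b0, b1 = p[0] * bias_up, p[1] * bias_down
            s = b0 + b1
            b = (b0 / s, b1 / s)               # biased selection probabilities
            if random.random() < b[0]:
                w *= p[0] / b[0]; x += 1
            else:
                w *= p[1] / b[1]; x -= 1
        return 0.0                             # event not reached before t_max

    # Estimate P(X reaches 40 before t = 10 | X0 = 10); births are favoured.
    n = 20000
    est = sum(weighted_ssa_reach(10, 40, 10.0, 1.0, 0.1, 2.0, 0.5)
              for _ in range(n)) / n
    ```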

  3. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    NASA Astrophysics Data System (ADS)

    Cao, Youfang; Liang, Jie

    2013-07-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.

  4. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method.

    PubMed

    Cao, Youfang; Liang, Jie

    2013-07-14

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.

  5. Collaborative Indoor Access Point Localization Using Autonomous Mobile Robot Swarm.

    PubMed

    Awad, Fahed; Naserllah, Muhammad; Omar, Ammar; Abu-Hantash, Alaa; Al-Taj, Abrar

    2018-01-31

    Localization of access points has become an important research problem due to the wide range of applications it addresses such as dismantling critical security threats caused by rogue access points or optimizing wireless coverage of access points within a service area. Existing proposed solutions have mostly relied on theoretical hypotheses or computer simulation to demonstrate the efficiency of their methods. The techniques that rely on estimating the distance using samples of the received signal strength usually assume prior knowledge of the signal propagation characteristics of the indoor environment at hand and tend to take a relatively large number of uniformly distributed random samples. This paper presents an efficient and practical collaborative approach to detect the location of an access point in an indoor environment without any prior knowledge of the environment. The proposed approach comprises a swarm of wirelessly connected mobile robots that collaboratively and autonomously collect a relatively small number of non-uniformly distributed random samples of the access point's received signal strength. These samples are used to efficiently and accurately estimate the location of the access point. The experimental testing verified that the proposed approach can identify the location of the access point in an accurate and efficient manner.
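    As a rough illustration of how scattered RSS samples can yield a location estimate without prior knowledge of the propagation environment, the sketch below jointly fits the access point position and log-distance path-loss parameters by non-linear least squares. It is a generic formulation on toy data, not the paper's swarm algorithm.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def residuals(theta, pts, rss):
        """Residuals of a log-distance path-loss model with unknown AP position
        (x, y) and unknown parameters P0 (dBm at 1 m) and exponent n."""
        x, y, p0, n = theta
        d = np.hypot(pts[:, 0] - x, pts[:, 1] - y) + 1e-6
        return rss - (p0 - 10.0 * n * np.log10(d))

    # Toy robot sample sites (m) and measured RSS (dBm); purely illustrative.
    pts = np.array([[0, 0], [8, 1], [2, 7], [9, 9], [5, 4]], float)
    rss = np.array([-52.0, -61.0, -58.0, -66.0, -49.0])

    fit = least_squares(residuals, x0=[4, 4, -40.0, 2.0], args=(pts, rss))
    ap_xy = fit.x[:2]   # estimated access point location
    ```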

  6. Collaborative Indoor Access Point Localization Using Autonomous Mobile Robot Swarm

    PubMed Central

    Awad, Fahed; Naserllah, Muhammad; Omar, Ammar; Abu-Hantash, Alaa; Al-Taj, Abrar

    2018-01-01

    Localization of access points has become an important research problem due to the wide range of applications it addresses such as dismantling critical security threats caused by rogue access points or optimizing wireless coverage of access points within a service area. Existing proposed solutions have mostly relied on theoretical hypotheses or computer simulation to demonstrate the efficiency of their methods. The techniques that rely on estimating the distance using samples of the received signal strength usually assume prior knowledge of the signal propagation characteristics of the indoor environment in hand and tend to take a relatively large number of uniformly distributed random samples. This paper presents an efficient and practical collaborative approach to detect the location of an access point in an indoor environment without any prior knowledge of the environment. The proposed approach comprises a swarm of wirelessly connected mobile robots that collaboratively and autonomously collect a relatively small number of non-uniformly distributed random samples of the access point’s received signal strength. These samples are used to efficiently and accurately estimate the location of the access point. The experimental testing verified that the proposed approach can identify the location of the access point in an accurate and efficient manner. PMID:29385042

  7. The Efficiency of Infants' Exploratory Play Is Related to Longer-Term Cognitive Development.

    PubMed

    Muentener, Paul; Herrig, Elise; Schulz, Laura

    2018-01-01

    In this longitudinal study we examined the stability of exploratory play in infancy and its relation to cognitive development in early childhood. We assessed infants' ( N = 130, mean age at enrollment = 12.02 months, SD = 3.5 months; range: 5-19 months) exploratory play four times over 9 months. Exploratory play was indexed by infants' attention to novelty, inductive generalizations, efficiency of exploration, face preferences, and imitative learning. We assessed cognitive development at the fourth visit for the full sample, and again at age three for a subset of the sample ( n = 38). The only measure that was stable over infancy was the efficiency of exploration. Additionally, infants' efficiency score predicted vocabulary size and distinguished at-risk infants recruited from early intervention sites from those not at risk. Follow-up analyses at age three provided additional evidence for the importance of the efficiency measure: more efficient exploration was correlated with higher IQ scores. These results suggest that the efficiency of infants' exploratory play can be informative about longer-term cognitive development.

  8. The Efficiency of Infants' Exploratory Play Is Related to Longer-Term Cognitive Development

    PubMed Central

    Muentener, Paul; Herrig, Elise; Schulz, Laura

    2018-01-01

    In this longitudinal study we examined the stability of exploratory play in infancy and its relation to cognitive development in early childhood. We assessed infants' (N = 130, mean age at enrollment = 12.02 months, SD = 3.5 months; range: 5–19 months) exploratory play four times over 9 months. Exploratory play was indexed by infants' attention to novelty, inductive generalizations, efficiency of exploration, face preferences, and imitative learning. We assessed cognitive development at the fourth visit for the full sample, and again at age three for a subset of the sample (n = 38). The only measure that was stable over infancy was the efficiency of exploration. Additionally, infants' efficiency score predicted vocabulary size and distinguished at-risk infants recruited from early intervention sites from those not at risk. Follow-up analyses at age three provided additional evidence for the importance of the efficiency measure: more efficient exploration was correlated with higher IQ scores. These results suggest that the efficiency of infants' exploratory play can be informative about longer-term cognitive development. PMID:29904360

  9. Field results for line intersect distance sampling of coarse woody debris

    Treesearch

    David L. R. Affleck

    2009-01-01

    A growing recognition of the importance of downed woody materials in forest ecosystem processes and global carbon budgets has sharpened the need for efficient sampling strategies that target this resource. Often the aggregate volume, biomass, or carbon content of the downed wood is of primary interest, making recently developed probability proportional-to-volume...

  10. Percussive Force Magnitude in Permafrost

    NASA Technical Reports Server (NTRS)

    Eustes, A. W., III; Bridgford, E.; Tischler, A.; Wilcox, B. H.

    2000-01-01

    An in-depth look at percussive drilling shows that the transmission efficiency is very important; however, data for percussive drilling in hard rock or permafrost are rarely available, and the existing data are very old. Transmission efficiency can be used as a measure of the transmission of the energy in the piston to the drill steel or bit and from the bit to the rock. A planar, centralized impact of the piston on the drill steel can optimize the transmission efficiency from the piston to the drill steel. A transmission efficiency of near 100% between piston and drill steel is possible. The transmission efficiency between bit and rock is dependent upon the interaction within the entire system. The main factors influencing this transmission efficiency are the contact area between the cutting structure and the surrounding rock (energy loss due to friction heat), the damping characteristics of the surrounding rock (energy dampening), and cuttings transport. Some of these parameters are not controllable. To fill the existing void in available drilling data, an experiment for gathering energy data in permafrost for percussive drilling was designed. Fifteen artificial permafrost samples were prepared. The samples differed in grain size distribution to observe a possible influence of the grain size distribution on drilling performance. The samples were then manually penetrated (with a sledgehammer) with two different spikes.

  11. Second-harmonic generation in single crystals of 2-(N,N-dimethylamino)-5-nitroacetanilide (DAN) at 1.3 micron

    NASA Astrophysics Data System (ADS)

    Kolinsky, P. V.; Chad, R. J.; Jones, R. J.; Hall, S. R.; Norman, P. A.

    1987-07-01

    Measurements are reported on efficient phase-matched second-harmonic generation in a single crystal of the organic material 2-(N,N-dimethylamino)-5-nitroacetanilide at the technologically important communications wavelength of 1.3 micron. Using 0.5 mJ pulses, a conversion efficiency of 18 percent was achieved for a sample 2 mm thick.

  12. Evaluation of Bacillus anthracis and Yersinia pestis sample collection from nonporous surfaces by quantitative real-time PCR.

    PubMed

    Hong-Geller, E; Valdez, Y E; Shou, Y; Yoshida, T M; Marrone, B L; Dunbar, J M

    2010-04-01

    This study aimed to validate sample collection methods for recovery of microbial evidence in the event of accidental or intentional release of biological agents into the environment. We evaluated the sample recovery efficiencies of two collection methods - swabs and wipes - for both nonvirulent and virulent strains of Bacillus anthracis and Yersinia pestis from four types of nonporous surfaces: two hydrophilic surfaces, stainless steel and glass, and two hydrophobic surfaces, vinyl and plastic. Sample recovery was quantified using real-time qPCR to assay for intact DNA signatures. We found no consistent difference in collection efficiency between swabs and wipes. Furthermore, collection efficiency was more surface-dependent for virulent strains than for nonvirulent strains. For the two nonvirulent strains, collection efficiency was similar across all four surfaces, although B. anthracis Sterne exhibited higher levels of recovery than Y. pestis A1122. In contrast, recovery of B. anthracis Ames spores and Y. pestis CO92 from the hydrophilic glass or stainless steel surfaces was generally more efficient than collection from the hydrophobic vinyl and plastic surfaces. Our results suggest that surface hydrophobicity may play a role in the strength of pathogen adhesion. The surface-dependent collection efficiencies observed with the virulent strains may arise from strain-specific expression of capsular material or other cell surface receptors that alter cell adhesion to specific surfaces. These findings contribute to the validation of standard bioforensics procedures and emphasize the importance of specific strain and surface interactions in pathogen detection.

  13. ZnO/Ag nanocomposite: an efficient catalyst for degradation studies of textile effluents under visible light.

    PubMed

    Saravanan, R; Karthikeyan, N; Gupta, V K; Thirumal, E; Thangadurai, P; Narayanan, V; Stephen, A

    2013-05-01

    The degradation of a model organic dye and an industrial effluent was studied using catalysts with different weight percentages of Ag in ZnO. In this study, the catalysts were prepared by a thermal decomposition method, which was employed for the first time in the preparation of ZnO/Ag nanocomposite catalysts. The physical and chemical properties of the prepared samples were studied using various techniques. The specific surface area, which plays an important role in photocatalytic degradation, was studied using BET analysis, and the sample with 10 wt.% Ag in ZnO showed the best degradation efficiency. The optical absorption (UV-vis) and emission (PL) properties of the samples were studied, and the results suggest better photocatalytic properties for the 10 wt.% Ag sample compared to the other samples. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.

  14. Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.

    PubMed

    Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A

    2017-04-01

    Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and its metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites. However, these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for human plasma, and it entailed a single sample-preparation method, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple sample-preparation method followed by an LC-MS method with a short run time. Therefore, this analytical method is useful for both clinical and research purposes.

  15. Efficiency and productivity measurement of rural township hospitals in China: a bootstrapping data envelopment analysis.

    PubMed

    Cheng, Zhaohui; Cai, Miao; Tao, Hongbing; He, Zhifei; Lin, Xiaojun; Lin, Haifeng; Zuo, Yuling

    2016-11-11

    Township hospitals (THs) are important components of the three-tier rural healthcare system of China. However, the efficiency and productivity of THs have been questioned since the healthcare reform was implemented in 2009. The objective of this study is to analyse the efficiency and productivity changes in THs before and after the reform process. A total of 48 sample THs were selected from the Xiaogan Prefecture in Hubei Province from 2008 to 2014. First, bootstrapping data envelopment analysis (DEA) was performed to estimate the technical efficiency (TE), pure technical efficiency (PTE) and scale efficiency (SE) of the sample THs during the period. Second, the bootstrapping Malmquist productivity index was used to calculate the productivity changes over time. The average TE, PTE and SE of the sample THs over the 7-year period were 0.5147, 0.6373 and 0.7080, respectively. The average TE and PTE increased from 2008 to 2012 but declined considerably after 2012. In general, the sample THs experienced a negative shift in productivity from 2008 to 2014. The negative change was 2.14%, which was attributed to a 23.89% decrease in technological changes (TC). The sample THs experienced a positive productivity shift from 2008 to 2012 but experienced deterioration from 2012 to 2014. There was considerable space for TE improvement in the sample THs since the average TE was relatively low. From 2008 to 2014, the sample THs experienced a decrease in productivity, and the adverse alteration in TC should be emphasised. In the context of healthcare reform, the factors that influence TE and productivity of THs are complex. Results suggest that numerous quantitative and qualitative studies are necessary to explore the reasons for the changes in TE and productivity. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
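    Once the four DEA distance-function scores are in hand, the Malmquist index and its decomposition into efficiency change (EC) and technological change (TC) are simple arithmetic. A sketch, assuming the scores come from prior (bootstrapped) DEA runs:

    ```python
    import math

    def malmquist(e_t_t, e_t_t1, e_t1_t, e_t1_t1):
        """Malmquist productivity index M = EC x TC. Argument e_a_b is the
        efficiency of period-b data measured against the period-a frontier;
        all four scores are assumed to come from prior DEA runs."""
        ec = e_t1_t1 / e_t_t                                    # efficiency change
        tc = math.sqrt((e_t_t1 / e_t1_t1) * (e_t_t / e_t1_t))   # technological change
        return ec * tc, ec, tc

    # Toy scores: M < 1 signals a productivity decline, as reported for the THs.
    m, ec, tc = malmquist(0.52, 0.40, 0.60, 0.50)
    print(m, ec, tc)
    ```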

  16. GenoCore: A simple and fast algorithm for core subset selection from large genotype datasets.

    PubMed

    Jeong, Seongmun; Kim, Jae-Yoon; Jeong, Soon-Chun; Kang, Sung-Taeg; Moon, Jung-Kyung; Kim, Namshin

    2017-01-01

    Selecting core subsets from plant genotype datasets is important for enhancing cost-effectiveness and shortening the time required for analyses such as genome-wide association studies (GWAS) and genomics-assisted breeding of crop species. Recently, large numbers of genetic markers (>100,000 single nucleotide polymorphisms) have been identified from high-density single nucleotide polymorphism (SNP) arrays and next-generation sequencing (NGS) data. However, no software has been available for picking out an efficient and consistent core subset from such a huge dataset. Software is needed that can coherently extract the genetically important samples in a population. We here present a new program, GenoCore, which can quickly and efficiently find a core subset representing the entire population. We introduce simple measures of coverage and diversity scores, which reflect genotype errors and genetic variations and can help to select samples rapidly and accurately from crop genotype datasets. Comparisons of our method to other core collection software using example datasets are performed to validate the performance according to genetic distance, diversity, coverage, required system resources, and the number of selected samples. GenoCore selects the smallest, most consistent, and most representative core collection from all samples, using less memory with more efficient scores, and shows greater genetic coverage compared to the other software tested. GenoCore was written in the R language, and can be accessed online with an example dataset and test results at https://github.com/lovemun/Genocore.
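    The selection idea can be sketched generically: greedily add the sample that most increases a coverage score until a target coverage is reached. GenoCore itself is an R program with its own coverage and diversity definitions; the allele-coverage score below is a simplified stand-in.

    ```python
    import numpy as np

    def greedy_core(geno, target=0.99):
        """Greedy core selection: repeatedly add the sample that covers the most
        not-yet-covered (marker, allele) pairs. `geno` is samples x markers with
        integer allele codes; the score is a stand-in for GenoCore's measures."""
        n, m = geno.shape
        all_alleles = {(j, a) for j in range(m) for a in np.unique(geno[:, j])}
        covered, core = set(), []
        while len(covered) / len(all_alleles) < target:
            gains = []
            for i in range(n):
                if i in core:
                    gains.append(-1)          # never re-pick a core sample
                    continue
                new = {(j, geno[i, j]) for j in range(m)} - covered
                gains.append(len(new))
            best = int(np.argmax(gains))
            core.append(best)
            covered |= {(j, geno[best, j]) for j in range(m)}
        return core

    geno = np.random.randint(0, 3, size=(50, 200))   # toy SNP matrix (0/1/2)
    print(greedy_core(geno))
    ```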

  17. Statistical properties of mean stand biomass estimators in a LIDAR-based double sampling forest survey design.

    Treesearch

    H.E. Anderson; J. Breidenbach

    2007-01-01

    Airborne laser scanning (LIDAR) can be a valuable tool in double-sampling forest survey designs. LIDAR-derived forest structure metrics are often highly correlated with important forest inventory variables, such as mean stand biomass, and LIDAR-based synthetic regression estimators have the potential to be highly efficient compared to single-stage estimators, which...

  18. Introducing sampling entropy in repository based adaptive umbrella sampling

    NASA Astrophysics Data System (ADS)

    Zheng, Han; Zhang, Yingkai

    2009-12-01

    Determining free energy surfaces along chosen reaction coordinates is a common and important task in simulating complex systems. Due to the complexity of energy landscapes and the existence of high barriers, one widely pursued objective in developing efficient simulation methods is to achieve uniform sampling among the thermodynamic states of interest. In this work, we have demonstrated sampling entropy (SE) as an excellent indicator of uniform sampling as well as of the convergence of free energy simulations. By introducing SE and the concentration theorem into the biasing-potential-updating scheme, we have further improved the adaptivity, robustness, and applicability of our recently developed repository based adaptive umbrella sampling (RBAUS) approach [H. Zheng and Y. Zhang, J. Chem. Phys. 128, 204106 (2008)]. Besides simulations of one-dimensional free energy profiles for various systems, the generality and efficiency of this new RBAUS-SE approach have been further demonstrated by determining two-dimensional free energy surfaces for the alanine dipeptide in the gas phase as well as in water.
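    The indicator itself is compact: a normalized Shannon entropy over the visit counts of the sampled bins, equal to 1 for perfectly uniform sampling and smaller when the simulation is trapped. A sketch following the generic definition (the exact normalization used in RBAUS-SE may differ):

    ```python
    import numpy as np

    def sampling_entropy(counts):
        """Normalized Shannon entropy of visit counts across bins along the
        reaction coordinate: 1.0 for perfectly uniform sampling, lower when
        sampling is trapped. The RBAUS-SE normalization may differ."""
        p = np.asarray(counts, float)
        p = p[p > 0] / p.sum()
        return float(-(p * np.log(p)).sum() / np.log(len(counts)))

    print(sampling_entropy([100, 98, 102, 99]))   # ~1.0, nearly uniform
    print(sampling_entropy([380, 10, 5, 5]))      # well below 1, poorly mixed
    ```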

  19. Fast simulation of packet loss rates in a shared buffer communications switch

    NASA Technical Reports Server (NTRS)

    Chang, Cheng-Shang; Heidelberger, Philip; Shahabuddin, Perwez

    1993-01-01

    This paper describes an efficient technique for estimating, via simulation, the probability of buffer overflows in a queueing model that arises in the analysis of ATM (Asynchronous Transfer Mode) communication switches. There are multiple streams of (autocorrelated) traffic feeding the switch that has a buffer of finite capacity. Each stream is designated as either being of high or low priority. When the queue length reaches a certain threshold, only high priority packets are admitted to the switch's buffer. The problem is to estimate the loss rate of high priority packets. An asymptotically optimal importance sampling approach is developed for this rare event simulation problem. In this approach, the importance sampling is done in two distinct phases. In the first phase, an importance sampling change of measure is used to bring the queue length up to the threshold at which low priority packets get rejected. In the second phase, a different importance sampling change of measure is used to move the queue length from the threshold to the buffer capacity.
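    The paper's two-phase scheme is specific to its threshold model, but the underlying idea can be shown in one phase with the classic change of measure for an M/M/1 queue: swap arrival and service rates so overflow paths become typical, and weight each hit by the likelihood ratio. A sketch under those textbook assumptions:

    ```python
    import random

    def overflow_prob(lam, mu, B, trials=100000):
        """Estimate P(queue reaches B before emptying | start at 1) for M/M/1 by
        importance sampling: sample up/down steps with arrival and service rates
        swapped, and weight each hit by the accumulated likelihood ratio. This is
        the classic single-threshold change of measure, not the paper's
        two-phase scheme."""
        p = lam / (lam + mu)        # true P(up-step)
        q = mu / (lam + mu)         # IS P(up-step): rates swapped
        est = 0.0
        for _ in range(trials):
            x, w = 1, 1.0
            while 0 < x < B:
                if random.random() < q:          # sample under the IS measure
                    w *= p / q
                    x += 1
                else:
                    w *= (1 - p) / (1 - q)
                    x -= 1
            if x == B:
                est += w
        return est / trials

    # An event on the order of (lam/mu)**(B-1): far beyond naive Monte Carlo.
    print(overflow_prob(lam=0.3, mu=0.7, B=30))
    ```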

  20. Probabilistic inference using linear Gaussian importance sampling for hybrid Bayesian networks

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Chang, K. C.

    2005-05-01

    Probabilistic inference for Bayesian networks is in general NP-hard using either exact algorithms or approximate methods. However, for very complex networks, only approximate methods such as stochastic sampling can provide a solution within a given time constraint. Several simulation methods are currently available. They include logic sampling (the first proposed stochastic method for Bayesian networks), the likelihood weighting algorithm (the most commonly used simulation method because of its simplicity and efficiency), the Markov blanket scoring method, and the importance sampling algorithm. In this paper, we first briefly review and compare these available simulation methods, then we propose an improved importance sampling algorithm called the linear Gaussian importance sampling algorithm for general hybrid models (LGIS). LGIS is aimed at hybrid Bayesian networks consisting of both discrete and continuous random variables with arbitrary distributions. It uses a linear function and Gaussian additive noise to approximate the true conditional probability distribution of a continuous variable given both its parents and evidence in a Bayesian network. One of the most important features of the newly developed method is that it can adaptively learn the optimal importance function from previous samples. We test the inference performance of LGIS using a 16-node linear Gaussian model and a 6-node general hybrid model. The performance comparison with other well-known methods such as junction tree (JT) and likelihood weighting (LW) shows that LGIS is very promising.
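    For contrast with LGIS, the sketch below runs plain likelihood weighting on the smallest hybrid fragment, a discrete parent with a linear Gaussian child, which is exactly the conditional form LGIS approximates and then adapts. The network and numbers are toy assumptions.

    ```python
    import math, random

    # Tiny hybrid network: D ~ Bernoulli(0.3), X | D ~ Normal(mu[D], 1.0),
    # with evidence X = 2.5. Plain likelihood weighting for comparison; LGIS
    # would instead adapt an importance function for the continuous part.
    mu = {0: 0.0, 1: 2.0}
    x_obs = 2.5

    def normal_pdf(x, m, s):
        return math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))

    num = den = 0.0
    for _ in range(50000):
        d = 1 if random.random() < 0.3 else 0    # sample the non-evidence node
        w = normal_pdf(x_obs, mu[d], 1.0)         # weight by evidence likelihood
        num += w * d
        den += w
    print(num / den)    # posterior P(D = 1 | X = 2.5), ~0.90 here
    ```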

  1. Paper-based SERS swab for rapid trace detection on real-world surfaces.

    PubMed

    Lee, Chang H; Tian, Limei; Singamaneni, Srikanth

    2010-12-01

    One of the important but often overlooked considerations in the design of surface-enhanced Raman scattering (SERS) substrates for trace detection is the efficiency of sample collection. Conventional designs based on rigid substrates such as silicon, alumina, and glass resist conformal contact with the surface under investigation, making sample collection inefficient. We demonstrate a novel SERS substrate based on common filter paper adsorbed with gold nanorods, which allows conformal contact with real-world surfaces, thus dramatically enhancing the sample collection efficiency compared to conventional rigid substrates. We demonstrate the detection of trace amounts of analyte (140 pg spread over 4 cm²) by simply swabbing the surface under investigation with the novel SERS substrate. The hierarchical fibrous structure of paper serves as a 3D vasculature for easy uptake and transport of the analytes to the electromagnetic hot spots in the paper. The simple yet highly efficient and cost-effective SERS substrate demonstrated here brings SERS-based trace detection closer to real-world applications.

  2. Globally efficient non-parametric inference of average treatment effects by empirical balancing calibration weighting

    PubMed Central

    Chan, Kwun Chuen Gary; Yam, Sheung Chi Phillip; Zhang, Zheng

    2015-01-01

    The estimation of average treatment effects based on observational data is extremely important in practice and has been studied by generations of statisticians under different frameworks. Existing globally efficient estimators require non-parametric estimation of a propensity score function, an outcome regression function or both, but their performance can be poor in practical sample sizes. Without explicitly estimating either function, we consider a wide class of calibration weights constructed to attain an exact three-way balance of the moments of observed covariates among the treated, the control, and the combined group. The wide class includes exponential tilting, empirical likelihood and generalized regression as important special cases, and extends survey calibration estimators to different statistical problems and with important distinctions. Global semiparametric efficiency for the estimation of average treatment effects is established for this general class of calibration estimators. The results show that efficiency can be achieved by solely balancing the covariate distributions, without resorting to direct estimation of the propensity score or outcome regression function. We also propose a consistent estimator for the efficient asymptotic variance, which does not involve additional functional estimation of either the propensity score or the outcome regression functions. The proposed variance estimator outperforms existing estimators that require a direct approximation of the efficient influence function. PMID:27346982
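    The exponential-tilting special case is easy to sketch: solve for weights on the control group so that the weighted control covariate means exactly match the treated means. This generic two-group version illustrates the idea; the authors' estimator enforces a three-way balance among treated, control, and combined groups.

    ```python
    import numpy as np
    from scipy.optimize import root

    # Exponential-tilting calibration: reweight control units so their weighted
    # covariate means exactly match the treated means. Toy data; a generic
    # two-group sketch of the idea, not the paper's three-way-balance estimator.
    rng = np.random.default_rng(0)
    X_treat = rng.normal(0.5, 1.0, size=(200, 2))
    X_ctrl = rng.normal(0.0, 1.0, size=(300, 2))
    target = X_treat.mean(axis=0)

    def moment_gap(lam):
        w = np.exp(X_ctrl @ lam)
        w /= w.sum()
        return X_ctrl.T @ w - target      # zero when the moments are balanced

    lam = root(moment_gap, x0=np.zeros(2)).x
    w = np.exp(X_ctrl @ lam); w /= w.sum()
    print(X_ctrl.T @ w, target)           # balanced control means vs treated means
    ```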

  3. Globally efficient non-parametric inference of average treatment effects by empirical balancing calibration weighting.

    PubMed

    Chan, Kwun Chuen Gary; Yam, Sheung Chi Phillip; Zhang, Zheng

    2016-06-01

    The estimation of average treatment effects based on observational data is extremely important in practice and has been studied by generations of statisticians under different frameworks. Existing globally efficient estimators require non-parametric estimation of a propensity score function, an outcome regression function or both, but their performance can be poor in practical sample sizes. Without explicitly estimating either function, we consider a wide class of calibration weights constructed to attain an exact three-way balance of the moments of observed covariates among the treated, the control, and the combined group. The wide class includes exponential tilting, empirical likelihood and generalized regression as important special cases, and extends survey calibration estimators to different statistical problems and with important distinctions. Global semiparametric efficiency for the estimation of average treatment effects is established for this general class of calibration estimators. The results show that efficiency can be achieved by solely balancing the covariate distributions, without resorting to direct estimation of the propensity score or outcome regression function. We also propose a consistent estimator for the efficient asymptotic variance, which does not involve additional functional estimation of either the propensity score or the outcome regression functions. The proposed variance estimator outperforms existing estimators that require a direct approximation of the efficient influence function.

  4. The great importance of normalization of LC-MS data for highly-accurate non-targeted metabolomics.

    PubMed

    Mizuno, Hajime; Ueda, Kazuki; Kobayashi, Yuta; Tsuyama, Naohiro; Todoroki, Kenichiro; Min, Jun Zhe; Toyo'oka, Toshimasa

    2017-01-01

    The non-targeted metabolomics analysis of biological samples is very important for understanding biological functions and diseases. LC combined with electrospray ionization-based MS has been a powerful tool and is widely used for metabolomic analyses. However, the ionization efficiency of electrospray ionization fluctuates for various unexpected reasons, such as matrix effects and intraday variations in instrument performance. To remove these fluctuations, normalization methods have been developed. Such techniques include increasing the sensitivity, separating co-eluting components, and normalizing the ionization efficiencies. Normalization techniques allow the ionization efficiencies of the detected metabolite peaks to be corrected simultaneously, achieving quantitative non-targeted metabolomics. In this review paper, we focus on these normalization methods for non-targeted metabolomics by LC-MS. Copyright © 2016 John Wiley & Sons, Ltd.

  5. Adaptive Importance Sampling for Control and Inference

    NASA Astrophysics Data System (ADS)

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within PI theory, the question of how to compute becomes the question of importance sampling. Efficient importance samplers are state-feedback controllers, and their use requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross-entropy method. We derive a gradient descent method that allows feedback controllers to be learned using an arbitrary parametrisation. We refer to this method as the path integral cross-entropy method, or PICE. We illustrate this method with some simple examples. PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.
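    PICE itself learns parametrised state-feedback controllers, which is beyond a short sketch, but its cross-entropy ingredient can be shown in the simplest static setting: adapt a Gaussian proposal toward a rare region and then importance-sample from it. All parameters below are illustrative.

    ```python
    import numpy as np

    def ce_adapt(threshold=4.0, n=2000, elite_frac=0.1, iters=10):
        """Cross-entropy adaptation of a Gaussian proposal N(mu, sigma^2) toward
        the rare set {x > threshold}, followed by an importance-sampling estimate
        of P(X > threshold) for X ~ N(0, 1). Illustrates the CE ingredient of
        PICE in a static setting, not the controller-learning algorithm."""
        mu, sigma = 0.0, 1.0
        for _ in range(iters):
            x = np.random.normal(mu, sigma, n)
            elite = np.sort(x)[-int(elite_frac * n):]   # samples nearest the rare set
            mu, sigma = elite.mean(), max(elite.std(), 1e-3)
        x = np.random.normal(mu, sigma, n)
        # likelihood ratio f(x)/g(x); the 1/sqrt(2*pi) factors cancel
        lr = np.exp(-0.5 * x**2) / (np.exp(-0.5 * ((x - mu) / sigma)**2) / sigma)
        return np.mean(lr * (x > threshold))

    print(ce_adapt())    # compare with the exact 1 - Phi(4) ~ 3.17e-5
    ```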

  6. Distance-limited perpendicular distance sampling for coarse woody debris: theory and field results

    Treesearch

    Mark J. Ducey; Micheal S. Williams; Jeffrey H. Gove; Steven Roberge; Robert S. Kenning

    2013-01-01

    Coarse woody debris (CWD) has been identified as an important component in many forest ecosystem processes. Perpendicular distance sampling (PDS) is one of several efficient new methods that have been proposed for CWD inventory. One drawback of PDS is that the maximum search distance can be very large, especially if CWD diameters are large or the volume factor...

  7. Efficiency of malaise traps and colored pan traps for collecting flower visiting insects from three forested ecosystems

    Treesearch

    Joshua W. Campbell; J.L. Hanula

    2007-01-01

    Pan and Malaise traps have been used widely to sample insect abundance and diversity, but no studies have compared their performance for sampling pollinators in forested ecosystems. Malaise trap design and color of pan traps are important parameters that influence insect pollinator catches. We compared pan trap (blue, yellow, white, and red) and Malaise trap catches...

  8. Sample collection of virulent and non-virulent B. anthracis and Y. pestis for bioforensics analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong-geller, Elizabeth; Valdez, Yolanda E; Shou, Yulin

    2009-01-01

    Validated sample collection methods are needed for recovery of microbial evidence in the event of accidental or intentional release of biological agents into the environment. To address this need, we evaluated the sample recovery efficiencies of two collection methods -- swabs and wipes -- for both non-virulent and virulent strains of B. anthracis and Y. pestis from four types of non-porous surfaces: two hydrophilic surfaces, stainless steel and glass, and two hydrophobic surfaces, vinyl and plastic. Sample recovery was quantified using real-time qPCR to assay for intact DNA signatures. We found no consistent difference in collection efficiency between swabs and wipes. Furthermore, collection efficiency was more surface-dependent for virulent strains than non-virulent strains. For the two non-virulent strains, B. anthracis Sterne and Y. pestis A1122, collection efficiency was approximately 100% and 1%, respectively, from all four surfaces. In contrast, recovery of B. anthracis Ames spores and Y. pestis CO92 from vinyl and plastic was generally lower compared to collection from glass or stainless steel, suggesting that surface hydrophobicity may play a role in the strength of pathogen adhesion. The surface-dependent collection efficiencies observed with the virulent strains may arise from strain-specific expression of capsular material or other cell surface receptors that alter cell adhesion to specific surfaces. These findings contribute to validation of standard bioforensics procedures and emphasize the importance of specific strain and surface interactions in pathogen detection.

  9. Meta-Storms: efficient search for similar microbial communities based on a novel indexing scheme and similarity score for metagenomic data.

    PubMed

    Su, Xiaoquan; Xu, Jian; Ning, Kang

    2012-10-01

    It has long intrigued scientists to effectively compare different microbial communities (also referred to as 'metagenomic samples' here) on a large scale: given a set of unknown samples, find similar metagenomic samples from a large repository and examine how similar these samples are. With the metagenomic samples accumulated to date, it is possible to build a database of metagenomic samples of interest. Any metagenomic sample could then be searched against this database to find the most similar metagenomic sample(s). However, on one hand, current databases with a large number of metagenomic samples mostly serve as data repositories that offer few functionalities for analysis; on the other hand, methods to measure the similarity of metagenomic data work well only for small sets of samples by pairwise comparison. It is not yet clear how to efficiently search for metagenomic samples against a large metagenomic database. In this study, we have proposed a novel method, Meta-Storms, that can systematically and efficiently organize and search metagenomic data. It includes the following components: (i) creating a database of metagenomic samples based on their taxonomical annotations, (ii) efficient indexing of samples in the database based on a hierarchical taxonomy indexing strategy, (iii) searching for a metagenomic sample against the database by a fast scoring function based on quantitative phylogeny and (iv) managing the database by index export, index import, data insertion, data deletion and database merging. We have collected more than 1300 metagenomic datasets from the public domain and in-house facilities, and tested the Meta-Storms method on these datasets. Our experimental results show that Meta-Storms is capable of database creation and effective searching for a large number of metagenomic samples, and it achieves accuracies similar to the current popular significance testing-based methods. The Meta-Storms method would serve as a suitable database management and search system to quickly identify similar metagenomic samples from a large pool of samples. Contact: ningkang@qibebt.ac.cn. Supplementary data are available at Bioinformatics online.
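    The flavour of a quantitative, taxonomy-aware score can be sketched by crediting shared abundance rank by rank down a lineage, so matches at finer ranks contribute more. The structure and weights below are illustrative assumptions, not Meta-Storms' actual scoring function.

    ```python
    def tax_similarity(a, b, depth_weight=(0.2, 0.3, 0.5)):
        """Toy taxonomy-aware similarity. `a` and `b` map lineage strings of the
        form 'phylum;genus;species' to relative abundances; shared abundance is
        credited at each rank with a rank-specific weight. Illustrative only."""
        score = 0.0
        for rank, w in enumerate(depth_weight):
            bins_a, bins_b = {}, {}
            for lineage, ab in a.items():
                key = ";".join(lineage.split(";")[: rank + 1])
                bins_a[key] = bins_a.get(key, 0.0) + ab
            for lineage, ab in b.items():
                key = ";".join(lineage.split(";")[: rank + 1])
                bins_b[key] = bins_b.get(key, 0.0) + ab
            shared = sum(min(v, bins_b.get(k, 0.0)) for k, v in bins_a.items())
            score += w * shared
        return score

    s1 = {"Firmicutes;Bacillus;subtilis": 0.6, "Proteobacteria;Escherichia;coli": 0.4}
    s2 = {"Firmicutes;Bacillus;cereus": 0.5, "Proteobacteria;Escherichia;coli": 0.5}
    print(tax_similarity(s1, s2))
    ```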

  10. Improved importance sampling technique for efficient simulation of digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific, previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communication systems.
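    The contrast between scaling (CIS) and translation (IIS) of the input density already shows up when estimating a Gaussian tail probability, a standard proxy for a bit-error rate. The sketch below compares the two; the parameter choices are illustrative, with mean = t the classic near-optimal translation.

    ```python
    import numpy as np

    # Estimate P(N(0,1) > t): CIS scales the noise standard deviation, IIS
    # translates the mean into the error region. Illustrative parameters only.
    t, n = 4.0, 100000
    rng = np.random.default_rng(1)

    def is_estimate(samples, log_f_over_g):
        """Importance-sampling estimate and its standard error."""
        w = np.exp(log_f_over_g) * (samples > t)
        return w.mean(), w.std() / np.sqrt(n)

    # CIS: sample from N(0, a^2) with a > 1; log f/g = -x^2/2 + x^2/(2a^2) + ln a
    a = t
    x = rng.normal(0.0, a, n)
    cis, cis_se = is_estimate(x, -0.5 * x**2 + 0.5 * (x / a) ** 2 + np.log(a))

    # IIS: sample from N(t, 1); log f/g = -y^2/2 + (y - t)^2/2
    y = rng.normal(t, 1.0, n)
    iis, iis_se = is_estimate(y, -0.5 * y**2 + 0.5 * (y - t) ** 2)

    print(cis, cis_se, iis, iis_se)   # translation typically gives a far smaller SE
    ```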

  11. Computerized Budget Monitoring.

    ERIC Educational Resources Information Center

    Stein, Julian U.; Rowe, Joe N.

    1989-01-01

    This article discusses the importance of budget monitoring in fiscal management; describes ways in which computerized budget monitoring increases accuracy, efficiency, and flexibility; outlines steps in the budget process; and presents sample reports, generated using the Lotus 1-2-3 spreadsheet and graphics program. (IAH)

  12. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling

    PubMed Central

    Zhou, Fuqun; Zhang, Aining

    2016-01-01

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to use these time-series MODIS datasets efficiently for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2–3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data. PMID:27792152
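    The variable-importance step has a standard scikit-learn expression: rank the time-series features with a Random Forest and retrain on the top half. The data below are toy placeholders for the MODIS composites.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Sketch of the variable-importance step: rank features, keep the best half,
    # and retrain. X (pixels x composite bands) and y (land cover labels) are
    # toy placeholders for the MODIS data used in the paper.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 40))                 # 40 time-series features
    y = (X[:, 3] + X[:, 17] > 0).astype(int)       # labels driven by a few bands

    rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    top = np.argsort(rf.feature_importances_)[::-1][:20]   # keep the best half
    rf_small = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:, top], y)
    ```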

  13. Optimal Subset Selection of Time-Series MODIS Images and Sample Data Transfer with Random Forests for Supervised Classification Modelling.

    PubMed

    Zhou, Fuqun; Zhang, Aining

    2016-10-25

    Nowadays, various time-series Earth Observation data with multiple bands are freely available, such as Moderate Resolution Imaging Spectroradiometer (MODIS) datasets including 8-day composites from NASA and 10-day composites from the Canada Centre for Remote Sensing (CCRS). It is challenging to use these time-series MODIS datasets efficiently for long-term environmental monitoring due to their vast volume and information redundancy. This challenge will be greater when Sentinel 2-3 data become available. Another challenge that researchers face is the lack of in-situ data for supervised modelling, especially for time-series data analysis. In this study, we attempt to tackle these two important issues with a case study of land cover mapping using CCRS 10-day MODIS composites with the help of two Random Forests features: variable importance and outlier identification. The variable importance feature is used to analyze and select optimal subsets of time-series MODIS imagery for efficient land cover mapping, and the outlier identification feature is utilized for transferring sample data available from one year to an adjacent year for supervised classification modelling. The results of the case study of agricultural land cover classification at a regional scale show that using only about half of the variables we can achieve land cover classification accuracy close to that generated using the full dataset. The proposed simple but effective solution of sample transferring could make supervised modelling possible for applications lacking sample data.

  14. Relative Importance Assigned to Health Care Rationing Principles at the Bedside: Evidence From a Portuguese and Bulgarian Survey.

    PubMed

    Pinho, Micaela Moreira; Pinto Borges, Ana

    Activity was undertaken to develop a Prioritization Scoring Index for Portugal and Bulgaria that weights the importance given to ethical rationing principles that should guide decisions at the bedside. Data from two random samples of 355 Portuguese and 298 Bulgarian members of the public were collected from an online questionnaire. Questions asked about the level of importance given to specific issues related to patient prioritization criteria. Responses were analyzed quantitatively with SPSS. In the process of selecting the patient to treat, Portuguese and Bulgarian respondents seem unanimous in giving greater importance to (i) the treatment outcomes, (ii) the severity of illness, (iii) children, and (iv) patients' fragility. In general, Portuguese and Bulgarian respondents allocate more than 50% of the prioritization weight to equity considerations, approximately 35% to efficiency considerations, and 5% to lottery selection. Even so, Bulgarian respondents rate equity considerations more highly, and efficiency considerations less highly, than Portuguese respondents do. Although the pursuit of efficiency seems to be valued by respondents, their major concern seems to be the reduction of inequalities in health.

  15. Trap Type, Chirality of α-Pinene, and Geographic Region Affect Sampling Efficiency of Root and Lower Stem Insects in Pine

    Treesearch

    Nadir Erbilgin; Alex Szele; Kier Dean Klepzig; Kenneth Francis Raffa

    2001-01-01

    Root and lower stem insects cause significant damage to conifers, vector phytopathogenic fungi, and can predispose trees to bark beetle attacks. The development of effective sampling techniques is an important component of managing these cryptic insects. We tested the effects of trap type and stereochemistry of α-pinene, in combination with ethanol, on catches of the...

  16. The experience sampling method: Investigating students' affective experience

    NASA Astrophysics Data System (ADS)

    Nissen, Jayson M.; Stetzer, MacKenzie R.; Shemwell, Jonathan T.

    2013-01-01

    Improving non-cognitive outcomes such as attitudes, efficacy, and persistence in physics courses is an important goal of physics education. This investigation implemented an in-the-moment surveying technique called the Experience Sampling Method (ESM) [1] to measure students' affective experience in physics. Measurements included: self-efficacy, cognitive efficiency, activation, intrinsic motivation, and affect. Data are presented that show contrasts in students' experiences (e.g., in physics vs. non-physics courses).

  17. The influence of zeolites fly ash bead/TiO2 composite material surface morphologies on their adsorption and photocatalytic performance

    NASA Astrophysics Data System (ADS)

    Yang, Lu; Wang, Fazhou; Hakki, Amer; Macphee, Donald E.; Liu, Peng; Hu, Shuguang

    2017-01-01

    Low-cost zeolite fly ash bead/TiO2 (ZFABT) composite materials with various surface structures were prepared to examine the importance of those structures for TiO2 coating, adsorbability, and photocatalytic performance. The results indicated that the fly ash bead (FAB) surface was significantly altered by the precipitation/growth of secondary zeolite phases after alkali activation, which generates abundant open pores and stacked petal-like spherical beads (∼2 μm, sodalite zeolites). More importantly, this porosity increases as the activation time is increased from 2 h to 12 h, through the precipitation of sodalite and then Na-P1 (lamellar crystals) and Na-X (octahedral crystals) zeolite structures. Compared to unsupported TiO2 or inactivated support/TiO2 samples, all ZFABT samples exhibited a higher adsorption capacity and photocatalytic efficiency for rhodamine B (RhB) removal. However, adsorption is not the only factor that influences the TiO2 surface reaction; the intraparticle diffusion rate of RhB molecules and light penetration are also important parameters. The ZFABT sample alkali-activated for 4 h exhibited the highest photocatalytic activity, indicating that its pore structure best balanced these parameters to achieve a synergistic adsorption/photocatalytic process. The kinetics model suggested that its high intraparticle diffusion rate allowed more RhB molecules to easily reach the reaction surface, which is especially important for high-efficiency photocatalysis.

  18. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
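    A plain Monte Carlo version of the instability check clarifies what the fast probability integration and adaptive importance sampling are accelerating: sample the uncertain parameters and test the eigenvalue criterion. The single-degree-of-freedom system and parameter distribution below are toy assumptions.

    ```python
    import numpy as np

    def p_unstable(trials=20000):
        """Plain Monte Carlo estimate of P(instability) for m*x'' + c*x' + k*x = 0
        with uncertain damping c: the system is unstable iff an eigenvalue of the
        first-order system matrix has positive real part. The paper accelerates
        exactly this check with fast probability integration and adaptive IS."""
        rng = np.random.default_rng(2)
        m, k = 1.0, 4.0
        hits = 0
        for _ in range(trials):
            c = rng.normal(0.05, 0.1)              # uncertain damping (toy numbers)
            A = np.array([[0.0, 1.0], [-k / m, -c / m]])
            if np.linalg.eigvals(A).real.max() > 0.0:
                hits += 1
        return hits / trials

    print(p_unstable())    # ~P(c < 0) for this single-DOF toy case
    ```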

  19. Integrated evaluation of the performance of a more than seven year old permeable reactive barrier at a site contaminated with chlorinated aliphatic hydrocarbons (CAHs)

    NASA Astrophysics Data System (ADS)

    Muchitsch, Nanna; Van Nooten, Thomas; Bastiaens, Leen; Kjeldsen, Peter

    2011-11-01

    An important issue of concern for permeable reactive iron barriers is their long-term efficiency, given the long operational periods required. Mineral precipitation resulting from the anaerobic corrosion of the iron filings, and the bacteria present in the barrier, may play an important role in long-term performance. An integrated study was performed on the Vapokon permeable reactive barrier (PRB) in Denmark by characterizing groundwater and iron core samples. The detailed field groundwater sampling, carried out from more than 75 well screens upstream and downstream of the barrier, showed very efficient removal (> 99%) of the most important CAHs (PCE, TCE and 1,1,1-TCA). However, significant formation of cis-DCE within the PRB resulted in an overall insufficient efficiency for cis-DCE removal. The detailed analysis of the upstream groundwater revealed a very heterogeneous spatial distribution of contaminant loading into the PRB, with the result that only about a quarter of the barrier system treats significant loads of CAHs. Laboratory batch experiments using contaminated groundwater from the site and iron material from the core samples revealed that the aged iron material performed as well as virgin granular iron of the same type, based on the determined degradation rates, despite parts of the cored iron material being covered by mineral precipitates (especially iron sulfides, carbonate green rust and aragonite). The PCR analysis performed on the iron core samples indicated the presence of a microbial consortium in the barrier. A wide range of species were identified, including sulfate- and iron-reducing bacteria, together with Dehalococcoides and Desulfuromonas species, indicating microbial reductive dehalogenation potential. The microbes had a profound effect on the performance of the barrier, as indicated by significant degradation of dichloromethane (which is typically unaffected by zero-valent iron) within the barrier.

  20. Communication: importance sampling including path correlation in semiclassical initial value representation calculations for time correlation functions.

    PubMed

    Pan, Feng; Tao, Guohua

    2013-03-07

    Full semiclassical (SC) initial value representation (IVR) for time correlation functions involves a double phase space average over a set of two phase points, each of which evolves along a classical path. Conventionally, the two initial phase points are sampled independently for all degrees of freedom (DOF) in the Monte Carlo procedure. Here, we present an efficient importance sampling scheme by including the path correlation between the two initial phase points for the bath DOF, which greatly improves the performance of the SC-IVR calculations for large molecular systems. Satisfactory convergence in the study of quantum coherence in vibrational relaxation has been achieved for a benchmark system-bath model with up to 21 DOF.

  1. Nanoparticle functionalised small-core suspended-core fibre - a novel platform for efficient sensing.

    PubMed

    Doherty, Brenda; Csáki, Andrea; Thiele, Matthias; Zeisberger, Matthias; Schwuchow, Anka; Kobelke, Jens; Fritzsche, Wolfgang; Schmidt, Markus A

    2017-02-01

    Detecting small quantities of specific target molecules is of major importance within bioanalytics for efficient disease diagnostics. One promising sensing approach relies on combining plasmonically-active waveguides with microfluidics yielding an easy-to-use sensing platform. Here we introduce suspended-core fibres containing immobilised plasmonic nanoparticles surrounding the guiding core as a concept for an entirely integrated optofluidic platform for efficient refractive index sensing. Due to the extremely small optical core and the large adjacent microfluidic channels, over two orders of magnitude of nanoparticle coverage densities have been accessed with millimetre-long sample lengths showing refractive index sensitivities of 170 nm/RIU for aqueous analytes where the fibre interior is functionalised by gold nanospheres. Our concept represents a fully integrated optofluidic sensing system demanding small sample volumes and allowing for real-time analyte monitoring, both of which are highly relevant within invasive bioanalytics, particularly within molecular disease diagnostics and environmental science.

  2. Efficient field-theoretic simulation of polymer solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villet, Michael C.; Fredrickson, Glenn H., E-mail: ghf@mrl.ucsb.edu; Department of Materials, University of California, Santa Barbara, California 93106

    2014-12-14

    We present several developments that facilitate the efficient field-theoretic simulation of polymers by complex Langevin sampling. A regularization scheme using finite Gaussian excluded volume interactions is used to derive a polymer solution model that appears free of ultraviolet divergences and hence is well-suited for lattice-discretized field theoretic simulation. We show that such models can exhibit ultraviolet sensitivity, a numerical pathology that dramatically increases sampling error in the continuum lattice limit, and further show that this pathology can be eliminated by appropriate model reformulation by variable transformation. We present an exponential time differencing algorithm for integrating complex Langevin equations for field-theoretic simulation, and show that the algorithm exhibits excellent accuracy and stability properties for our regularized polymer model. These developments collectively enable substantially more efficient field-theoretic simulation of polymers, and illustrate the importance of simultaneously addressing analytical and numerical pathologies when implementing such computations.

  3. Testing of high-volume sampler inlets for the sampling of atmospheric radionuclides.

    PubMed

    Irshad, Hammad; Su, Wei-Chung; Cheng, Yung S; Medici, Fausto

    2006-09-01

    Sampling of air for radioactive particles is one of the most important techniques used to detect nuclear debris from a nuclear weapon test in the Earth's atmosphere or particles vented from underground or underwater tests. Massive-flow air samplers are used to sample air for any indication of radionuclides that are a signature of nuclear tests. The International Monitoring System of the Comprehensive Nuclear Test Ban Treaty Organization includes seismic, hydroacoustic, infrasound, and gaseous xenon isotope sampling technologies, in addition to radionuclide sampling, to monitor for any violation of the treaty. Lovelace Respiratory Research Institute has developed a large wind tunnel to test the outdoor radionuclide samplers for the International Monitoring System. The inlets for these samplers are tested for their collection efficiencies for different particle sizes at various wind speeds. This paper describes the results from the testing of two radionuclide sampling units used in the International Monitoring System. The possible areas of depositional wall losses are identified and the losses in these areas are determined. Sampling inlet type 1 was tested at a wind speed of 2.2 m s⁻¹ for particles of 5, 10, and 20 µm aerodynamic diameter. The global collection efficiency was about 87.6% for 10-µm particles for sampling inlet type 1. Sampling inlet type 2 was tested at three wind speeds (0.56, 2.2, and 6.6 m s⁻¹) for 5-, 10-, and 20-µm aerodynamic diameter particles in two different configurations (sampling head lowered and raised). The global collection efficiencies for these configurations for 10-µm particles at 2.2 m s⁻¹ wind speed were 77.4% and 82.5%, respectively. The sampling flow rate was 600 m³ h⁻¹ for both sampling inlets.

  4. Performance evaluation of an importance sampling technique in a Jackson network

    NASA Astrophysics Data System (ADS)

    Mahdipour, Ebrahim; Rahmani, Amir Masoud; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique commonly used to speed up Monte Carlo simulation of rare events. However, little is known about the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviations analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare-event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We estimate the probability of network blocking for various sets of parameters, as well as the probability of customers missing their deadlines for different loads and deadlines. Finally, we show that the probability of total population overflow may be affected by the deadline values, service rates and arrival rates.
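
    The textbook change of measure for queueing rare events exchanges arrival and service rates, so the simulated system drifts toward the rare state while per-step likelihood ratios undo the bias. A minimal sketch for a single M/M/1 queue follows; it is far simpler than the paper's two-node, Markov-modulated network, and the rates and overflow level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def overflow_prob(lam, mu, level, n_runs=10_000):
    """Estimate P(queue reaches `level` before emptying | start at 1)
    for an M/M/1 queue via importance sampling with swapped rates
    (lam <-> mu), the standard exponential change of measure."""
    est = 0.0
    p_nom = lam / (lam + mu)   # nominal arrival probability (embedded chain)
    p_is = mu / (lam + mu)     # importance-sampling arrival probability
    for _ in range(n_runs):
        q, logw = 1, 0.0
        while 0 < q < level:
            if rng.random() < p_is:           # arrival under IS measure
                q += 1
                logw += np.log(p_nom / p_is)
            else:                             # departure under IS measure
                q -= 1
                logw += np.log((1.0 - p_nom) / (1.0 - p_is))
        if q == level:
            est += np.exp(logw)
    return est / n_runs

print(overflow_prob(lam=0.3, mu=1.0, level=20))
```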

  5. The correspondence of surface climate parameters with satellite and terrain data

    NASA Technical Reports Server (NTRS)

    Dozier, Jeff; Davis, Frank

    1987-01-01

    One of the goals of the research was to develop a ground sampling strategy for calibrating remotely sensed measurements of surface climate parameters. The initial sampling strategy involved stratification of the terrain based on important ancillary surface variables such as slope, exposure, insolation, geology, drainage, fire history, etc. For a spatially heterogeneous population, sampling error is reduced and efficiency increased by stratifying the landscape into more homogeneous sub-areas and by employing periodic random spacing of samples. These concepts were applied in the initial stratification of the study site for the purpose of locating and allocating instrumentation.

  6. PSA discriminator influence on (222)Rn efficiency detection in waters by liquid scintillation counting.

    PubMed

    Stojković, Ivana; Todorović, Nataša; Nikolov, Jovana; Tenjović, Branislava

    2016-06-01

    A procedure for (222)Rn determination in aqueous samples using liquid scintillation counting (LSC) was evaluated and optimized. Measurements were performed on an ultra-low-background Quantulus 1220™ spectrometer equipped with a pulse shape analysis (PSA) circuit that discriminates alpha from beta spectra. Since the calibration procedure is carried out with a (226)Ra standard, which has both alpha- and beta-emitting progeny, the PSA discriminator level is of vital importance for precise spectral separation. The calibration procedure was improved by investigating how the PSA discriminator level and, consequently, the activity of the (226)Ra calibration standard influence the (222)Rn detection efficiency. Quench effects on the generated spectra, i.e. on the determination of the radon detection efficiency, were also investigated, and a quench calibration curve was obtained. Radon determination in waters, based on the procedure modified according to the activity of the (226)Ra standard used and dependent on the PSA setup, was evaluated with prepared (226)Ra solution samples and drinking water samples, including an assessment of the variation in measurement uncertainty.

  7. Paper based Flexible and Conformal SERS Substrate for Rapid Trace Detection on Real-world Surfaces

    NASA Astrophysics Data System (ADS)

    Singamaneni, Srikanth; Lee, Chang; Tian, Limei

    2011-03-01

    One of the important but often overlooked considerations in the design of surface-enhanced Raman scattering (SERS) substrates for trace detection is the efficiency of sample collection. Conventional designs based on rigid substrates such as silicon, alumina, and glass resist conformal contact with the surface under investigation, making sample collection inefficient. We demonstrate a novel SERS substrate based on common filter paper adsorbed with gold nanorods, which allows conformal contact with real-world surfaces, thus dramatically enhancing sample collection efficiency compared to conventional rigid substrates. We demonstrate the detection of trace amounts of analyte (140 pg spread over 4 cm²) by simply swabbing the surface under investigation with the novel SERS substrate. The hierarchical fibrous structure of the paper serves as a 3D vasculature for easy uptake and transport of the analytes to the electromagnetic hot spots in the paper. The simple yet highly efficient and cost-effective SERS substrate demonstrated here brings SERS-based trace detection closer to real-world applications. We acknowledge financial support from the Center for Materials Innovation at Washington University.

  8. Detection of Shigella in Milk and Clinical Samples by Magnetic Immunocaptured-Loop-Mediated Isothermal Amplification Assay

    PubMed Central

    Zhang, Liding; Wei, Qiujiang; Han, Qinqin; Chen, Qiang; Tai, Wenlin; Zhang, Jinyang; Song, Yuzhu; Xia, Xueshan

    2018-01-01

    Shigella is an important food-borne zoonotic bacterial pathogen that can cause clinically severe diarrhea, and there is an urgent need for a specific, sensitive, and rapid methodology for its detection. In this study, loop-mediated isothermal amplification (LAMP) combined with a magnetic immunocapture assay (IC-LAMP) was developed for the first time for the detection of Shigella in pure culture, artificially contaminated milk, and clinical stool samples. The method exhibited a detection limit of 8.7 CFU/mL. Compared with the polymerase chain reaction, IC-LAMP is sensitive, specific, and reliable for monitoring Shigella. Additionally, IC-LAMP is more convenient, efficient, and rapid than ordinary LAMP, as it more efficiently enriches pathogen cells without extraction of genomic DNA. Under isothermal conditions, the amplification curves and green fluorescence were detected within 30 min in the presence of genomic DNA template. The overall analysis time was approximately 1 h, including the enrichment and lysis of the bacterial cells, a significantly shorter detection time. Therefore, the IC-LAMP methodology described here is potentially useful for the efficient detection of Shigella in various samples. PMID:29467730

  9. Identification of multiple mRNA and DNA sequences from small tissue samples isolated by laser-assisted microdissection.

    PubMed

    Bernsen, M R; Dijkman, H B; de Vries, E; Figdor, C G; Ruiter, D J; Adema, G J; van Muijen, G N

    1998-10-01

    Molecular analysis of small tissue samples has become increasingly important in biomedical studies. Using a laser dissection microscope and modified nucleic acid isolation protocols, we demonstrate that multiple mRNA as well as DNA sequences can be identified from a single-cell sample. In addition, we show that the specificity of procurement of tissue samples is not compromised by smear contamination resulting from scraping of the microtome knife during sectioning of lesions. The procedures described herein thus allow for efficient RT-PCR or PCR analysis of multiple nucleic acid sequences from small tissue samples obtained by laser-assisted microdissection.

  10. A Preview of Coming Attractions: Classroom Teacher's Idea Notebook.

    ERIC Educational Resources Information Center

    Morin, Joy Ann

    1995-01-01

    Contends that it is important for students to be motivated and well prepared for class units and activities. Describes a "previews of coming attractions" instructional strategy that uses advance organizers to increase information processing efficiency. Includes a sample unit outline illustrating this approach. (CFR)

  11. Improved diffusion Monte Carlo propagators for bosonic systems using Itô calculus

    NASA Astrophysics Data System (ADS)

    Håkansson, P.; Mella, M.; Bressanini, Dario; Morosi, Gabriele; Patrone, Marta

    2006-11-01

    The construction of importance-sampled diffusion Monte Carlo (DMC) schemes accurate to second order in the time step is discussed. A central aspect in obtaining efficient second-order schemes is the numerical solution of the stochastic differential equation (SDE) associated with the Fokker-Planck equation responsible for the importance sampling procedure. In this work, stochastic predictor-corrector schemes solving the SDE and consistent with Itô calculus are used in DMC simulations of helium clusters. These schemes are numerically compared with alternative algorithms obtained by splitting the Fokker-Planck operator, an approach that we analyze using the analytical tools provided by Itô calculus. The numerical results show that predictor-corrector methods are indeed accurate to second order in the time step and that they present a smaller time-step bias and a better efficiency than second-order split-operator-derived schemes when computing ensemble averages for bosonic systems. The possible extension of the predictor-corrector methods to higher orders is also discussed.
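
    A minimal sketch of a stochastic predictor-corrector (Heun-type) step for the drift-diffusion part of importance-sampled DMC is shown below, assuming a Gaussian trial wavefunction so the drift is linear. Reusing the same noise increment in the corrector stage is what keeps the scheme consistent with Itô calculus for additive noise; this is a generic second-order scheme, not necessarily the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

def drift(R, alpha=1.0):
    # Importance-sampling drift b = grad(ln psi_T) for a Gaussian trial
    # wavefunction psi_T = exp(-alpha * R**2 / 2) (illustrative choice);
    # the stationary density of dR = b dt + dW is then psi_T**2.
    return -alpha * R

def heun_step(R, dt):
    """One predictor-corrector step for dR = b(R) dt + dW, reusing the
    same Gaussian increment xi in both stages."""
    xi = rng.normal(0.0, np.sqrt(dt), size=R.shape)
    R_pred = R + drift(R) * dt + xi                        # Euler predictor
    return R + 0.5 * (drift(R) + drift(R_pred)) * dt + xi  # corrector

R = rng.normal(size=1000)       # ensemble of walkers
for _ in range(2000):
    R = heun_step(R, dt=1e-3)
print("ensemble variance:", R.var())  # tends to 1/(2*alpha) = 0.5
```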

  12. Efficient Stochastic Rendering of Static and Animated Volumes Using Visibility Sweeps.

    PubMed

    von Radziewsky, Philipp; Kroes, Thomas; Eisemann, Martin; Eisemann, Elmar

    2017-09-01

    Stochastically solving the rendering integral (particularly visibility) is the de facto standard for physically-based light transport, but it is computationally expensive, especially when displaying heterogeneous volumetric data. In this work, we present efficient techniques to speed up the rendering process via a novel visibility-estimation method in concert with an unbiased importance sampling (involving environmental lighting and visibility inside the volume), filtering, and update techniques for both static and animated scenes. Our major contributions include a progressive estimate of partial occlusions based on a fast sweeping-plane algorithm. These occlusions are stored in an octahedral representation, which can be conveniently transformed into a quadtree-based hierarchy suited for joint importance sampling. Further, we propose sweep-space filtering, which suppresses the occurrence of fireflies, and investigate different update schemes for animated scenes. Our technique is unbiased, requires little precomputation, is highly parallelizable, and is applicable to various volume data sets, dynamic transfer functions, animated volumes, and changing environmental lighting.

  13. Single Cell Proteolytic Assays to Investigate Cancer Clonal Heterogeneity and Cell Dynamics Using an Efficient Cell Loading Scheme

    NASA Astrophysics Data System (ADS)

    Chen, Yu-Chih; Cheng, Yu-Heng; Ingram, Patrick; Yoon, Euisik

    2016-06-01

    Proteolytic degradation of the extracellular matrix (ECM) is critical in cancer invasion, and recent work suggests that heterogeneous cancer populations cooperate in this process. Despite the importance of cell heterogeneity, conventional proteolytic assays measure average activity, requiring thousands of cells and providing limited information about heterogeneity and dynamics. Here, we developed a microfluidic platform that provides high-efficiency cell loading and simple valveless isolation, so the proteolytic activity of a small sample (10-100 cells) can be easily characterized. Combined with a single cell derived (clonal) sphere formation platform, we have successfully demonstrated the importance of microenvironmental cues for proteolytic activity and also investigated the difference between clones. Furthermore, the platform allows monitoring single cells at multiple time points, unveiling different cancer cell line dynamics in proteolytic activity. The presented tool facilitates single cell proteolytic analysis using small samples, and our findings illuminate the heterogeneous and dynamic nature of proteolytic activity.

  14. Inference from Samples of DNA Sequences Using a Two-Locus Model

    PubMed Central

    Griffiths, Robert C.

    2011-01-01

    Performing inference on contemporary samples of DNA sequence data is an important and challenging task. Computationally intensive methods such as importance sampling (IS) are attractive because they make full use of the available data, but in the presence of recombination the large state space of genealogies can be prohibitive. In this article, we make progress by developing an efficient IS proposal distribution for a two-locus model of sequence data. We show that the proposal developed here leads to much greater efficiency, outperforming existing IS methods that could be adapted to this model. Among several possible applications, the algorithm can be used to find maximum likelihood estimates for mutation and crossover rates, and to perform ancestral inference. We illustrate the method on previously reported sequence data covering two loci either side of the well-studied TAP2 recombination hotspot. The two loci are themselves largely non-recombining, so we obtain a gene tree at each locus and are able to infer in detail the effect of the hotspot on their joint ancestry. We summarize this joint ancestry by introducing the gene graph, a summary of the well-known ancestral recombination graph. PMID:21210733

  15. Efficient fluorescence quenching in electrochemically exfoliated graphene decorated with gold nanoparticles

    NASA Astrophysics Data System (ADS)

    Hurtado-Morales, M.; Ortiz, M.; Acuña, C.; Nerl, H. C.; Nicolosi, V.; Hernández, Y.

    2016-07-01

    High surface area graphene sheets were obtained by electrochemical exfoliation of graphite in an acid medium under constant potential conditions. Filtration and centrifugation processes played an important role in order to obtain stable dispersions in water. Scanning electron microscopy and transmission electron microscopy imaging revealed highly exfoliated crystalline samples of ∼5 μm. Raman, Fourier transform infrared and x-ray photoelectron spectroscopy further confirmed the high quality of the exfoliated material. The electrochemically exfoliated graphene (EEG) was decorated with gold nanoparticles (AuNPs) using sodium cholate as a buffer layer. This approach allowed for a non-covalent functionalization without altering the desirable electronic properties of the EEG. The AuNP-EEG samples were characterized with various techniques including absorbance and fluorescence spectroscopy. These samples displayed a fluorescence signal using an excitation wavelength of 290 nm. The calculated quantum yield (Φ) for these samples was 40.04%, a high efficiency compared to previous studies using solution processable graphene.

  16. Four studies on effects of environmental factors on the quality of National Atmospheric Deposition Program measurements

    USGS Publications Warehouse

    Wetherbee, Gregory A.; Latysh, Natalie E.; Lehmann, Christopher M.B.; Rhodes, Mark F.

    2011-01-01

    Selected aspects of National Atmospheric Deposition Program / National Trends Network (NADP/NTN) protocols are evaluated in four studies. Meteorological conditions have minor impacts on the error in NADP/NTN sampling. The efficiency of frozen precipitation sample collection is lower than for liquid precipitation samples. Variability of NTN measurements is higher for relatively low-intensity deposition of frozen precipitation than for higher-intensity deposition of liquid precipitation. Urbanization of the landscape surrounding NADP/NTN sites is not affecting trends in wet-deposition chemistry data to a measurable degree. Five NADP siting criteria intended to preserve wet-deposition sample integrity have varying degrees of effectiveness. NADP siting criteria for objects within the 90-degree cones and trees within the 120-degree cones projected from the collector bucket to the sky are important for protecting sample integrity. Tall vegetation, fences, and other objects located within 5 meters of the collectors are related to the frequency of visible sample contamination, indicating the importance of these factors in NADP siting criteria.

  17. An efficient sampling technique for sums of bandpass functions

    NASA Technical Reports Server (NTRS)

    Lawton, W. M.

    1982-01-01

    A well known sampling theorem states that a bandlimited function can be completely determined by its values at a uniformly placed set of points whose density is at least twice the highest frequency component of the function (Nyquist rate). A less familiar but important sampling theorem states that a bandlimited narrowband function can be completely determined by its values at a properly chosen, nonuniformly placed set of points whose density is at least twice the passband width. This allows for efficient digital demodulation of narrowband signals, which are common in sonar, radar and radio interferometry, without the side effect of signal group delay from an analog demodulator. This theorem was extended by developing a technique which allows a finite sum of bandlimited narrowband functions to be determined by its values at a properly chosen, nonuniformly placed set of points whose density can be made arbitrarily close to the sum of the passband widths.
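
    The uniform special case of this principle is easy to demonstrate numerically: when the carrier frequency is an integer multiple of a sampling rate exceeding twice the passband width, undersampling aliases the band intact to baseband and demodulates the signal digitally. The frequencies below are illustrative; the technique described in the abstract generalizes this to sums of bandpass functions with nonuniformly placed samples.

```python
import numpy as np

# A narrowband signal: 3 Hz message on a 100 Hz carrier, sampled at
# f_s = 25 Hz -- far below the 200 Hz Nyquist rate of the carrier but
# above twice the passband width.
f_c, f_s = 100.0, 25.0
n = np.arange(0, 2, 1.0 / f_s)                      # sample instants
x_n = np.cos(2 * np.pi * 3.0 * n) * np.cos(2 * np.pi * f_c * n)

# Because f_c is an integer multiple of f_s (100 = 4 * 25), the sampled
# carrier cos(2*pi*f_c*n) equals 1 at every sample instant, so the
# samples coincide with the 3 Hz message: undersampling has demodulated
# the signal without an analog demodulator.
assert np.allclose(x_n, np.cos(2 * np.pi * 3.0 * n))
print("max deviation:", np.abs(x_n - np.cos(2 * np.pi * 3.0 * n)).max())
```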

  18. Evaluation of the sustainability of contrasted pig farming systems: economy.

    PubMed

    Ilari-Antoine, E; Bonneau, M; Klauke, T N; Gonzàlez, J; Dourmad, J Y; De Greef, K; Houwers, H W J; Fabrega, E; Zimmer, C; Hviid, M; Van der Oever, B; Edwards, S A

    2014-12-01

    The aim of this paper is to present an efficient tool for evaluating the economic component of the sustainability of pig farming systems. The selected tool, IDEA, was tested on a sample of farms from 15 contrasted systems in Europe. A statistical analysis was carried out to check the capacity of the indicators to capture the variability of the population and to determine which indicators contributed most to it. The scores obtained for the farms were consistent with the reality of pig production, and the distribution of the variables showed substantial variability across the sample. Principal component analysis and cluster analysis separated the sample into five subgroups in which the six main indicators differed significantly, which underlines the robustness of the tool. The IDEA method proved easy to understand, requires few initial variables, and has an efficient benchmarking system; all six indicators contributed to fully describing a varied and contrasted population.

  19. Spectral Absorption Properties of Aerosol Particles from 350-2500nm

    NASA Technical Reports Server (NTRS)

    Martins, J. Vanderlei; Artaxo, Paulo; Kaufman, Yoram J.; Castanho, Andrea D.; Remer, Lorraine A.

    2009-01-01

    The aerosol spectral absorption efficiency (α_a, in m² g⁻¹) is measured over an extended wavelength range (350-2500 nm) using an improved, calibrated and validated reflectance technique, and applied to urban aerosol samples from Sao Paulo, Brazil, and from a site in Virginia in the Eastern US that experiences transported urban/industrial aerosol. The average α_a values (approximately 3 m² g⁻¹ at 550 nm) for Sao Paulo samples are 10 times larger than the α_a values obtained for aerosols in Virginia. Sao Paulo aerosols also show evidence of enhanced UV absorption in selected samples, probably associated with organic aerosol components. This extra UV absorption can double the absorption efficiency observed from black carbon alone, thereby reducing surface UV fluxes by up to 50%, with important implications for climate, UV photolysis rates, and remote sensing from space.

  20. PRODUCTION OF HYDRATED ELECTRONS FROM PHOTOIONIZATION OF DISSOLVED ORGANIC MATTER IN NATURAL WATERS

    EPA Science Inventory

    Under UV irradiation, an important primary photochemical reaction of colored dissolved organic matter (CDOM) is electron ejection, producing hydrated electrons (e-aq). The efficiency of this process has been studied in both fresh and seawater samples with both steady-state scave...

  1. High-density grids for efficient data collection from multiple crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. As a result, crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures.

  2. High-density grids for efficient data collection from multiple crystals

    PubMed Central

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; Barnes, Christopher O.; Bonagura, Christopher A.; Brehmer, Winnie; Brunger, Axel T.; Calero, Guillermo; Caradoc-Davies, Tom T.; Chatterjee, Ruchira; Degrado, William F.; Fraser, James S.; Ibrahim, Mohamed; Kern, Jan; Kobilka, Brian K.; Kruse, Andrew C.; Larsson, Karl M.; Lemke, Heinrik T.; Lyubimov, Artem Y.; Manglik, Aashish; McPhillips, Scott E.; Norgren, Erik; Pang, Siew S.; Soltis, S. M.; Song, Jinhu; Thomaston, Jessica; Tsai, Yingssu; Weis, William I.; Woldeyes, Rahel A.; Yachandra, Vittal; Yano, Junko; Zouni, Athina; Cohen, Aina E.

    2016-01-01

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. Crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures. PMID:26894529

  3. High-density grids for efficient data collection from multiple crystals

    DOE PAGES

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; ...

    2015-11-03

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. As a result, crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures.

  4. Network Sampling with Memory: A proposal for more efficient sampling from social networks.

    PubMed

    Mouw, Ted; Verdery, Ashton M

    2012-08-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE), the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a "List" mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a "Search" mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS.
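
    A toy sketch of the two proposal modes follows: it maintains a revealed-neighbour list and alternates between a uniform draw from that list ("List" mode) and a preference for nodes with largely unexplored neighbourhoods (a crude stand-in for the bridge-node priority of "Search" mode). The graph, the p_search parameter, and the selection rule are illustrative assumptions, and the inclusion-probability bookkeeping needed for design-based estimation is omitted.

```python
import random
import networkx as nx

def nsm_toy_sample(G, seed, n_target, p_search=0.5):
    """Crude sketch of the List/Search alternation in Network Sampling
    with Memory; not the full NSM estimator."""
    sample = [seed]
    revealed = set()
    current = seed
    while len(sample) < n_target:
        revealed |= set(G.neighbors(current))
        candidates = list(revealed - set(sample))
        if not candidates:
            break
        if random.random() < p_search:
            # "Search" mode: favour nodes whose neighbourhoods are still
            # largely unrevealed (proxy for bridges to unexplored regions).
            current = max(candidates,
                          key=lambda v: len(set(G.neighbors(v)) - revealed))
        else:
            # "List" mode: uniform draw from the revealed list.
            current = random.choice(candidates)
        sample.append(current)
    return sample

G = nx.barabasi_albert_graph(500, 3, seed=42)
print(len(nsm_toy_sample(G, seed=0, n_target=100)))
```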

  5. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    PubMed Central

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246

  6. Postharvest Ultrasound-Assisted Freeze-Thaw Pretreatment Improves the Drying Efficiency, Physicochemical Properties, and Macamide Biosynthesis of Maca (Lepidium meyenii).

    PubMed

    Chen, Jin-Jin; Gong, Peng-Fei; Liu, Yi-Lan; Liu, Bo-Yan; Eggert, Dawn; Guo, Yuan-Heng; Zhao, Ming-Xia; Zhao, Qing-Sheng; Zhao, Bing

    2018-04-01

    A novel technique, ultrasound-assisted freeze-thaw pretreatment (UFP), was developed to improve the drying efficiency of maca and the synthesis of bioactive amides in maca. The optimal UFP conditions were ultrasonic processing for 90 min at 30 °C with six freeze-thaw cycles. Samples with freeze-thaw pretreatment (FP), ultrasound pretreatment (UP), and UFP were prepared for a comparative study, with a no-pretreatment (NP) sample included as a control. The results showed that UFP improved the drying efficiency of maca slices, giving the highest effective moisture diffusivity (1.75 × 10⁻⁹ m²/s). This result was further supported by low-field nuclear magnetic resonance (LF-NMR) analysis and scanning electron microscopy (SEM). The rehydration capacity and protein content of the maca slices were also improved by UFP. More importantly, the contents of bioactive macamides and their biosynthetic precursors were increased 2.5- and 10-fold, respectively. In conclusion, UFP is an efficient technique for improving the drying efficiency, physicochemical properties, and bioactive macamide content of maca, and can be applied in the industrial manufacture of maca products.

  7. Time-Varying, Multi-Scale Adaptive System Reliability Analysis of Lifeline Infrastructure Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Kurtz, Nolan Scot

    2014-09-01

    The majority of current societal and economic needs world-wide are met by the existing networked, civil infrastructure. Because the cost of managing such infrastructure is high and increases with time, risk-informed decision making is essential for those with management responsibilities for these systems. To address such concerns, a methodology that accounts for new information, deterioration, component models, component importance, group importance, network reliability, hierarchical structure organization, and efficiency concerns has been developed. This methodology analyzes the use of new information through the lens of adaptive Importance Sampling for structural reliability problems. Deterioration, multi-scale bridge models, and time-variant component importance are investigated for a specific network. Furthermore, both bridge and pipeline networks are studied for group and component importance, as well as for hierarchical structures in the context of specific networks. Efficiency is the primary driver throughout this study. With this risk-informed approach, those responsible for management can address deteriorating infrastructure networks in an organized manner.

  8. Measuring the efficiency of large pharmaceutical companies: an industry analysis.

    PubMed

    Gascón, Fernando; Lozano, Jesús; Ponte, Borja; de la Fuente, David

    2017-06-01

    This paper evaluates the relative efficiency of a sample of 37 large pharmaceutical laboratories over the period 2008-2013 using a data envelopment analysis (DEA) approach. We describe in detail the procedure followed to select and construct relevant inputs and outputs that characterize the production and innovation activity of these pharmaceutical firms. Models are estimated with financial information from Datastream, including R&D investment, and the number of new drugs authorized by the European Medicines Agency (EMA) and the US Food and Drug Administration (FDA), considering the time effect. The relative performances of these firms, taking into consideration the strategic importance of R&D, suggest that the pharmaceutical industry is a highly competitive sector, given that there are many laboratories at the efficient frontier and many inefficient laboratories close to it. Additionally, we use data from S&P Capital IQ to analyze 2071 financial transactions announced by our sample of laboratories as an alternative way to gain access to new drugs, and we link these transactions with R&D investment and DEA efficiency. We find that efficient laboratories make more financial transactions on average, and the relative size of each transaction is larger. However, pharmaceutical companies that are simultaneously more efficient and invest more internally in R&D announce smaller transactions relative to total assets.

  9. SPR based immunosensor for detection of Legionella pneumophila in water samples

    NASA Astrophysics Data System (ADS)

    De Lorenzis, Enrico; Manera, Maria G.; Montagna, Giovanni; Cimaglia, Fabio; Chiesa, Maurizio; Poltronieri, Palmiro; Santino, Angelo; Rella, Roberto

    2013-05-01

    Detection of legionellae by water sampling is an important factor in epidemiological investigations of Legionnaires' disease and its prevention. To avoid the labor-intensive problems of conventional methods, an alternative, highly sensitive and simple method is proposed for detecting L. pneumophila in aqueous samples. A compact Surface Plasmon Resonance (SPR) instrumentation prototype, provided with proper microfluidic tools, was built. The developed immunosensor is capable of dynamically following the binding between antigens and the corresponding antibody molecules immobilized on the SPR sensor surface. The immobilization strategy used in this work includes an efficient step aimed at orienting the antibodies on the sensor surface. The feasibility of integrating SPR-based biosensing setups with microfluidic technologies, resulting in a low-cost and portable biosensor, is demonstrated.

  10. Fast trace determination of nine odorant and estrogenic chloro- and bromo-phenolic compounds in real water samples through automated solid-phase extraction coupled with liquid chromatography tandem mass spectrometry.

    PubMed

    Yuan, Su-Fen; Liu, Ze-Hua; Lian, Hai-Xian; Yang, Chuang-Tao; Lin, Qing; Yin, Hua; Lin, Zhang; Dang, Zhi

    2018-02-01

    A fast and reliable method was developed for the simultaneous trace determination of nine odorous and estrogenic chloro- and bromo-phenolic compounds (CPs and BPs) in water samples using solid-phase extraction (SPE) coupled with liquid chromatography tandem mass spectrometry (LC-MS/MS). For sample preparation, the extraction efficiencies of two widely applied cartridges, Oasis HLB and Sep-Pak C18, were compared, and the Oasis HLB cartridge showed much better extraction performance; the pH of the water sample also plays an important role in extraction, and pH 2-3 was found to be most appropriate. For separation of the target compounds, a small addition of ammonium hydroxide markedly improved the detection sensitivity, and the optimal concentration was determined to be 0.2%. The developed method was validated and showed excellent linearity (R² > 0.995), low limits of detection (LOD, 1.9-6.2 ng/L), and good recoveries of 57-95% in surface and tap water with low relative standard deviations (RSD, 1.3-17.4%). The method was finally applied to one tap water and one surface water sample; most of the nine targets were detected, but all were below their odor thresholds, and their estrogen equivalents (EEQ) were also very low.

  11. Colorimetric Measurements of Amylase Activity: Improved Accuracy and Efficiency with a Smartphone

    ERIC Educational Resources Information Center

    Dangkulwanich, Manchuta; Kongnithigarn, Kaness; Aurnoppakhun, Nattapat

    2018-01-01

    Routinely used in quantitative determination of various analytes, UV-vis spectroscopy is commonly taught in undergraduate chemistry laboratory courses. Because the technique measures the absorbance of light through the samples, losses from reflection and scattering by large molecules interfere with the measurement. To emphasize the importance of…

  12. Exact Tests for the Rasch Model via Sequential Importance Sampling

    ERIC Educational Resources Information Center

    Chen, Yuguo; Small, Dylan

    2005-01-01

    Rasch proposed an exact conditional inference approach to testing his model but never implemented it because it involves the calculation of a complicated probability. This paper furthers Rasch's approach by (1) providing an efficient Monte Carlo methodology for accurately approximating the required probability and (2) illustrating the usefulness…
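
    The heart of such a test is sampling 0/1 tables with the observed row and column margins and reweighting by the proposal probability. The sketch below uses a deliberately crude column-by-column proposal (row weights proportional to remaining row sums) to estimate the number of tables with given margins; the margins are illustrative, and published sequential importance sampling schemes use sharper conditional proposals.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(4)

def sample_table_logq(row_sums, col_sums):
    """Fill a 0/1 table column by column, drawing each column's 1-rows
    by weighted sampling without replacement (weights = remaining row
    sums). Returns the log proposal probability of the ordered draw,
    or None if the construction dead-ends."""
    r = np.array(row_sums, dtype=float)
    logq = 0.0
    for c in col_sums:
        avail = r.copy()
        chosen = []
        for _ in range(c):
            tot = avail.sum()
            if tot <= 0:
                return None
            p = avail / tot
            i = rng.choice(len(r), p=p)
            logq += np.log(p[i])
            avail[i] = 0.0
            chosen.append(i)
        for i in chosen:
            r[i] -= 1
    return logq if np.all(r == 0) else None

# Estimate the number of 3x3 binary tables with margins (2,1,1)/(2,1,1);
# each table corresponds to prod(c_j!) ordered draws, so divide that out.
row_sums, col_sums = [2, 1, 1], [2, 1, 1]
orderings = np.prod([factorial(c) for c in col_sums])
ws = []
for _ in range(20_000):
    logq = sample_table_logq(row_sums, col_sums)
    ws.append(0.0 if logq is None else np.exp(-logq))
print("estimated table count:", np.mean(ws) / orderings)  # brute force: 5
```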

  13. Secondary Data Analysis: An Important Tool for Addressing Developmental Questions

    ERIC Educational Resources Information Center

    Greenhoot, Andrea Follmer; Dowsett, Chantelle J.

    2012-01-01

    Existing data sets can be an efficient, powerful, and readily available resource for addressing questions about developmental science. Many of the available databases contain hundreds of variables of interest to developmental psychologists, track participants longitudinally, and have representative samples. In this article, the authors discuss the…

  14. Evaluation of Dried Urine Spot Method to Screen Cotinine among Tobacco Dependents: An Exploratory Study.

    PubMed

    Jain, Raka; Quraishi, Rizwana; Verma, Arpita

    2017-01-01

    Assessment of cotinine, a metabolite of nicotine, in body fluids is an important approach for validating self-reports among tobacco users. Adapting assays to dried urine spots (DUSs) offers ease of collection and transportation, minimal invasiveness, and small required sample volumes. The aim of the present study was to develop an efficient method for testing cotinine in DUSs and to evaluate its clinical applicability. This involved optimizing the conditions for detection, recovery, and stability of cotinine from dried urine spotted on filter paper. Enzyme-linked immunosorbent assay was used for screening, and confirmation was done by gas chromatography. For clinical applicability, urine samples of tobacco users were tested. Water was found to be a suitable extraction solvent compared with carbonate-bicarbonate buffer (pH 9.2) and saline. Screening was achieved with two punches taken from a 20 μl (1.3 cm diameter) spotted urine sample, and confirmation with five complete circles, each of 20 μl sample volume. The recovery in water was 97%. The limit of detection of the method was 100 ng/ml. No signs of significant degradation were found under any of the storage conditions. All urine samples of tobacco users tested positive by both the conventional method and DUSs, and the method proved efficient. DUS samples are a useful alternative for biological monitoring of recent nicotine use, especially in developing countries where sample logistics can be an important concern.

  15. A novel approach to surveying sturgeon using side-scan sonar and occupancy modeling

    USGS Publications Warehouse

    Flowers, H. Jared; Hightower, Joseph E.

    2013-01-01

    Technological advances represent opportunities to enhance and supplement traditional fisheries sampling approaches. One example of growing importance for fisheries research is hydroacoustic technology such as side-scan sonar. Advantages of side-scan sonar over traditional techniques include the ability to sample large areas efficiently and the potential to survey fish without physical handling, which is important for species of conservation concern such as endangered sturgeons. Our objectives were to design an efficient survey methodology for sampling Atlantic Sturgeon Acipenser oxyrinchus using side-scan sonar and to develop methods for analyzing these data. In North Carolina and South Carolina, we surveyed six rivers thought to contain varying abundances of sturgeon using a combination of side-scan sonar, telemetry, and video cameras (to sample jumping sturgeon). Lower reaches of each river near the saltwater-freshwater interface were surveyed on three occasions (generally successive days), and we used occupancy modeling to analyze these data. We were able to detect sturgeon in five of the six rivers using these methods. Side-scan sonar was effective in detecting sturgeon, with estimated gear-specific detection probabilities ranging from 0.2 to 0.5 and river-specific occupancy estimates (per 2-km river segment) ranging from 0.0 to 0.8. Future extensions of this occupancy modeling framework will involve the use of side-scan sonar data to assess sturgeon habitat and abundance in different river systems.
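
    The single-season occupancy model behind such estimates has two parameters, the occupancy probability psi and the per-occasion detection probability p: a segment with d detections in K visits contributes psi·p^d·(1-p)^(K-d) to the likelihood, and an all-zero history contributes psi·(1-p)^K + (1-psi). A minimal maximum-likelihood sketch, with made-up detection histories, is given below.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative detection histories: rows = river segments, columns =
# three survey occasions (1 = sturgeon detected).
Y = np.array([[1, 0, 1],
              [0, 0, 0],
              [1, 1, 0],
              [0, 0, 0],
              [0, 1, 0]])

def negloglik(theta):
    # psi = P(segment occupied), p = P(detect | occupied), logit scale.
    psi, p = 1.0 / (1.0 + np.exp(-theta))
    ll = 0.0
    for y in Y:
        d, K = y.sum(), y.size
        l_occ = psi * p**d * (1.0 - p) ** (K - d)
        ll += np.log(l_occ if d > 0 else l_occ + (1.0 - psi))
    return -ll

res = minimize(negloglik, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat, p_hat = 1.0 / (1.0 + np.exp(-res.x))
print(f"psi = {psi_hat:.2f}, p = {p_hat:.2f}")
```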

  16. Benchmarking the efficiency of the Chilean water and sewerage companies: a double-bootstrap approach.

    PubMed

    Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon; Villegas, Andrés

    2018-03-01

    Benchmarking the efficiency of water companies is essential for setting water tariffs and promoting their sustainability. Most previous studies have applied conventional data envelopment analysis (DEA) models, but DEA is a deterministic method that does not allow identification of the environmental factors influencing efficiency scores. To overcome this limitation, this paper evaluates the efficiency of a sample of Chilean water and sewerage companies using a double-bootstrap DEA model. The results showed that the ranking of water and sewerage companies changes notably depending on whether efficiency scores are computed with conventional or double-bootstrap DEA models. Moreover, the percentage of non-revenue water and customer density were found to influence the efficiency of Chilean water and sewerage companies. This paper illustrates the importance of using a robust and reliable method to increase the relevance of benchmarking tools.
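
    For reference, the conventional input-oriented, constant-returns DEA model behind such benchmarks (not the paper's double-bootstrap procedure) solves one small linear program per decision-making unit: shrink the unit's inputs by a factor theta while a convex combination of peers still matches its outputs. A sketch with made-up data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency scores. X: (n, m) inputs,
    Y: (n, s) outputs; one LP per decision-making unit."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        A_in = np.c_[-X[o][:, None], X.T]      # sum_j l_j x_j <= theta x_o
        A_out = np.c_[np.zeros((s, 1)), -Y.T]  # sum_j l_j y_j >= y_o
        res = linprog(c, A_ub=np.r_[A_in, A_out],
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.x[0])
    return np.array(scores)

# Toy data: 5 companies, 2 inputs (staff, network length), 1 output.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
Y = np.ones((5, 1))
print(dea_ccr_input(X, Y).round(3))
```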

  17. A Cobb Douglas Stochastic Frontier Model on Measuring Domestic Bank Efficiency in Malaysia

    PubMed Central

    Hasan, Md. Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md. Azizul

    2012-01-01

    Banking system plays an important role in the economic development of any country. Domestic banks, which are the main components of the banking system, have to be efficient; otherwise, they may create obstacle in the process of development in any economy. This study examines the technical efficiency of the Malaysian domestic banks listed in the Kuala Lumpur Stock Exchange (KLSE) market over the period 2005–2010. A parametric approach, Stochastic Frontier Approach (SFA), is used in this analysis. The findings show that Malaysian domestic banks have exhibited an average overall efficiency of 94 percent, implying that sample banks have wasted an average of 6 percent of their inputs. Among the banks, RHBCAP is found to be highly efficient with a score of 0.986 and PBBANK is noted to have the lowest efficiency with a score of 0.918. The results also show that the level of efficiency has increased during the period of study, and that the technical efficiency effect has fluctuated considerably over time. PMID:22900009

  18. A Cobb Douglas stochastic frontier model on measuring domestic bank efficiency in Malaysia.

    PubMed

    Hasan, Md Zobaer; Kamil, Anton Abdulbasah; Mustafa, Adli; Baten, Md Azizul

    2012-01-01

    Banking system plays an important role in the economic development of any country. Domestic banks, which are the main components of the banking system, have to be efficient; otherwise, they may create obstacle in the process of development in any economy. This study examines the technical efficiency of the Malaysian domestic banks listed in the Kuala Lumpur Stock Exchange (KLSE) market over the period 2005-2010. A parametric approach, Stochastic Frontier Approach (SFA), is used in this analysis. The findings show that Malaysian domestic banks have exhibited an average overall efficiency of 94 percent, implying that sample banks have wasted an average of 6 percent of their inputs. Among the banks, RHBCAP is found to be highly efficient with a score of 0.986 and PBBANK is noted to have the lowest efficiency with a score of 0.918. The results also show that the level of efficiency has increased during the period of study, and that the technical efficiency effect has fluctuated considerably over time.

  19. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wen, Haiming; Lin, Yaojun; Seidman, David N.

    The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for the preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder-epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding-dimpling-ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders requires only two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully to preparing TEM samples with large thin areas and high quality from many different mechanically milled metallic powders.

  20. Preserving correlations between trajectories for efficient path sampling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gingrich, Todd R.; Geissler, Phillip L.; Chemical Sciences Division, Lawrence Berkeley National Laboratory, Berkeley, California 94720

    2015-06-21

    Importance sampling of trajectories has proved a uniquely successful strategy for exploring rare dynamical behaviors of complex systems in an unbiased way. Carrying out this sampling, however, requires an ability to propose changes to dynamical pathways that are substantial, yet sufficiently modest to obtain reasonable acceptance rates. Satisfying this requirement becomes very challenging in the case of long trajectories, due to the characteristic divergences of chaotic dynamics. Here, we examine schemes for addressing this problem, which engineer correlation between a trial trajectory and its reference path, for instance using artificial forces. Our analysis is facilitated by a modern perspective on Markov chain Monte Carlo sampling, inspired by non-equilibrium statistical mechanics, which clarifies the types of sampling strategies that can scale to long trajectories. Viewed in this light, the most promising such strategy guides a trial trajectory by manipulating the sequence of random numbers that advance its stochastic time evolution, as done in a handful of existing methods. In cases where this "noise guidance" synchronizes trajectories effectively, as in the Glauber dynamics of a two-dimensional Ising model, we show that efficient path sampling can be achieved for even very long trajectories.
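
    A cartoon of the noise-guidance idea, assuming simple overdamped Langevin dynamics in a double-well potential: store the random numbers that generated a reference path, perturb them slightly in a way that preserves their Gaussian statistics, and replay them, so the trial path stays strongly correlated with the reference instead of diverging chaotically.

```python
import numpy as np

rng = np.random.default_rng(5)

def propagate(x0, noises, dt=0.01):
    """Overdamped Langevin dynamics in V(x) = (x^2 - 1)^2, driven by a
    prescribed sequence of standard-normal random numbers."""
    xs = [x0]
    for xi in noises:
        x = xs[-1]
        force = -4.0 * x * (x**2 - 1.0)
        xs.append(x + force * dt + np.sqrt(2.0 * dt) * xi)
    return np.array(xs)

T = 1000
noises = rng.normal(size=T)          # noise history of the reference path
path = propagate(-1.0, noises)

# "Noise guidance": mix the stored random numbers with a small amount of
# fresh noise (the mixing preserves their unit-Gaussian distribution)
# and replay them to generate a correlated trial path.
eps = 0.1
trial_noises = np.sqrt(1.0 - eps**2) * noises + eps * rng.normal(size=T)
trial_path = propagate(-1.0, trial_noises)
print("path correlation:", np.corrcoef(path, trial_path)[0, 1])
```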

  1. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    DOE PAGES

    Wen, Haiming; Lin, Yaojun; Seidman, David N.; ...

    2015-09-09

    The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for the preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder-epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding-dimpling-ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders requires only two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully to preparing TEM samples with large thin areas and high quality from many different mechanically milled metallic powders.

  2. Dynamics Sampling in Transition Pathway Space.

    PubMed

    Zhou, Hongyu; Tao, Peng

    2018-01-09

    The minimum energy pathway contains important information describing the transition between two states on a potential energy surface (PES). Chain-of-states methods were developed to efficiently calculate minimum energy pathways connecting two stable states. In the chain-of-states framework, a series of structures is generated and optimized to represent the minimum energy pathway connecting two states. However, multiple pathways may exist between two states and should be identified to obtain a full view of the transitions. Therefore, we developed an enhanced sampling method, named the direct pathway dynamics sampling (DPDS) method, to facilitate exploration of a PES for multiple pathways connecting two stable states, as well as additional minima and their associated transition pathways. In the DPDS method, molecular dynamics simulations are carried out on the target PES within a chain-of-states framework to directly sample the transition pathway space. The simulations are regulated by two parameters controlling the distance among states along the pathway and the smoothness of the pathway. One advantage of the chain-of-states framework is that no specific reaction coordinates are necessary to generate the reaction pathway, because such information is implicitly represented by the structures along the pathway. The chain-of-states setup in DPDS greatly enhances sampling of the high-energy space between the two end states, such as transition states. By removing the constraint on the end states of the pathway, DPDS will also sample pathways connecting minima on a PES in addition to the end points of the starting pathway. This feature makes DPDS an ideal method to directly explore transition pathway space. Three examples demonstrate the efficiency of DPDS in sampling the high-energy regions important for reactions on the PES.

  3. Assessing efficiency of spatial sampling using combined coverage analysis in geographical and feature spaces

    NASA Astrophysics Data System (ADS)

    Hengl, Tomislav

    2015-04-01

    Efficiency of spatial sampling largely determines success of model building. This is especially important for geostatistical mapping, where an initial sampling plan should provide a good representation or coverage of both geographical space (defined by the study area mask map) and feature space (defined by the multi-dimensional covariates). Otherwise the model will need to extrapolate and, hence, the overall uncertainty of the predictions will be high. In many cases, geostatisticians use point data sets which are produced using unknown or inconsistent sampling algorithms. Many point data sets in environmental sciences suffer from spatial clustering and systematic omission of feature space. But how to quantify these 'representation' problems and how to incorporate this knowledge into model building? The author has developed a generic function called 'spsample.prob' (Global Soil Information Facilities package for R) which simultaneously determines (effective) inclusion probabilities as an average between kernel density estimation (geographical spreading of points; analysed using the spatstat package in R) and MaxEnt analysis (feature-space spreading of points; analysed using the MaxEnt software used primarily for species distribution modelling); a minimal sketch of this averaging appears below. The output 'iprob' map indicates whether the sampling plan has systematically missed some important locations and/or features, and can also be used as an input for geostatistical modelling, e.g. as a weight map for geostatistical model fitting. The spsample.prob function can also be used in combination with accessibility analysis (costs of field surveys are usually a function of distance from the road network, slope and land cover) to allow for simultaneous maximization of average inclusion probabilities and minimization of total survey costs. The author postulates that, by estimating effective inclusion probabilities using combined geographical and feature-space analysis, and by comparing survey costs to representation efficiency, an optimal initial sampling plan can be produced which satisfies both criteria: (a) good representation (i.e. within a tolerance threshold), and (b) minimized survey costs. This sampling analysis framework could become especially interesting for generating sampling plans in new areas, e.g. those for which no previous spatial prediction model exists. The presentation includes data processing demos with standard soil sampling data sets Ebergotzen (Germany) and Edgeroi (Australia), also available via the GSIF package.
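
    The averaging step itself is simple once the two coverage surfaces exist on a common grid. The sketch below is a stand-in, not the spsample.prob implementation: random arrays play the role of the kernel-density and MaxEnt surfaces, and all names and thresholds are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical normalized surfaces on a common grid: kernel density of the
# sample points in geographic space, and a MaxEnt-style suitability surface
# in feature space (both random stand-ins here).
geo_density = rng.random((100, 100))
feature_density = rng.random((100, 100))

def inclusion_probability(geo, feat):
    """Average the two rescaled surfaces, mirroring the idea of combining
    geographic and feature-space coverage into one 'iprob' map."""
    return 0.5 * (geo / geo.max() + feat / feat.max())

iprob = inclusion_probability(geo_density, feature_density)
undersampled = iprob < 0.2   # candidate areas the plan systematically missed
print("fraction of area under-represented:", undersampled.mean())
```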

  4. Integrated Hamiltonian sampling: a simple and versatile method for free energy simulations and conformational sampling.

    PubMed

    Mori, Toshifumi; Hamers, Robert J; Pedersen, Joel A; Cui, Qiang

    2014-07-17

    Motivated by specific applications and the recent work of Gao and co-workers on integrated tempering sampling (ITS), we have developed a novel sampling approach referred to as integrated Hamiltonian sampling (IHS). IHS is straightforward to implement and complementary to existing methods for free energy simulation and enhanced configurational sampling. The method carries out sampling using an effective Hamiltonian constructed by integrating the Boltzmann distributions of a series of Hamiltonians. By judiciously selecting the weights of the different Hamiltonians, one achieves rapid transitions among the energy landscapes that underlie different Hamiltonians and therefore an efficient sampling of important regions of the conformational space. Along this line, IHS shares similar motivations with the enveloping distribution sampling (EDS) approach of van Gunsteren and co-workers, although the ways that distributions of different Hamiltonians are integrated are rather different in IHS and EDS. Specifically, we report efficient ways for determining the weights using a combination of histogram flattening and weighted histogram analysis approaches, which make it straightforward to include many end-state and intermediate Hamiltonians in IHS so as to enhance its flexibility. Using several relatively simple condensed phase examples, we illustrate the implementation and application of IHS as well as potential developments for the near future. The relation of IHS to several related sampling methods such as Hamiltonian replica exchange molecular dynamics and λ-dynamics is also briefly discussed.
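
    The integrated effective Hamiltonian described above satisfies exp(-β·U_eff(x)) = Σ_k w_k exp(-β·U_k(x)). Below is a minimal numerical sketch of that construction; the three component Hamiltonians and their weights are toy assumptions, and a real implementation would also derive MD forces from the weighted average of the component forces.

```python
import numpy as np
from scipy.special import logsumexp

beta = 1.0                              # inverse temperature (assumed)
weights = np.array([0.5, 0.3, 0.2])     # hypothetical weights w_k

def u_components(x):
    """A toy series of Hamiltonians U_k(x) (assumed for illustration)."""
    return np.array([(x - 1.0) ** 2, (x + 1.0) ** 2, 0.5 * x ** 2])

def u_eff(x):
    """Integrated Hamiltonian: exp(-beta*U_eff) = sum_k w_k exp(-beta*U_k),
    evaluated stably via logsumexp."""
    return -logsumexp(-beta * u_components(x), b=weights) / beta

print([round(u_eff(x), 3) for x in np.linspace(-3.0, 3.0, 7)])
```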

  5. Monte Carlo sampling in diffusive dynamical systems

    NASA Astrophysics Data System (ADS)

    Tapias, Diego; Sanders, David P.; Altmann, Eduardo G.

    2018-05-01

    We introduce a Monte Carlo algorithm to efficiently compute transport properties of chaotic dynamical systems. Our method exploits the importance sampling technique that favors trajectories in the tail of the distribution of displacements, where deviations from a diffusive process are most prominent. We search for initial conditions using a proposal that correlates states in the Markov chain constructed via a Metropolis-Hastings algorithm. We show that our method outperforms the direct sampling method and also Metropolis-Hastings methods with alternative proposals. We test our general method through numerical simulations in 1D (box-map) and 2D (Lorentz gas) systems.
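
    A minimal sketch of the idea follows; the 1D map, the tilt strength, and the proposal width are assumptions rather than the authors' exact settings. Initial conditions are sampled with Metropolis-Hastings under a target tilted toward large displacements, and the tilt is then divided out to recover unbiased transport statistics.

```python
import numpy as np

rng = np.random.default_rng(2)
n_steps, beta_is = 50, 0.5      # trajectory length and tilt strength (assumed)

def displacement(x0, a=1.1):
    """Toy 1D diffusive map (assumed): iterate and return the net displacement."""
    x = x0
    for _ in range(n_steps):
        x = x + a * np.sin(2.0 * np.pi * x)
    return x - x0

def disp_and_logw(x0):
    d = displacement(x0)
    return d, beta_is * abs(d)   # importance density ~ exp(beta_is * |displacement|)

x = rng.random()
d, lw = disp_and_logw(x)
disps, logws = [], []
for _ in range(20000):
    xp = x + 0.05 * rng.normal()            # correlated (local) proposal
    dp, lwp = disp_and_logw(xp)
    if np.log(rng.random()) < lwp - lw:     # Metropolis-Hastings acceptance
        x, d, lw = xp, dp, lwp
    disps.append(d)
    logws.append(lw)

w = np.exp(-np.array(logws))                # undo the tilt to unbias estimates
msd = np.sum(w * np.array(disps) ** 2) / np.sum(w)
print("reweighted mean squared displacement:", msd)
```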

  6. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    PubMed

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.

  7. Microbial ecology laboratory procedures manual NASA/MSFC

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    1990-01-01

    An essential part of the efficient operation of any microbiology laboratory involved in sample analysis is a standard procedures manual. The purpose of this manual is to provide concise and well defined instructions on routine technical procedures involving sample analysis and methods for monitoring and maintaining quality control within the laboratory. Of equal importance is the safe operation of the laboratory. This manual outlines detailed procedures to be followed in the microbial ecology laboratory to assure safety, analytical control, and validity of results.

  8. An adaptive Bayesian inference algorithm to estimate the parameters of a hazardous atmospheric release

    NASA Astrophysics Data System (ADS)

    Rajaona, Harizo; Septier, François; Armand, Patrick; Delignon, Yves; Olry, Christophe; Albergel, Armand; Moussafir, Jacques

    2015-12-01

    In the eventuality of an accidental or intentional atmospheric release, the reconstruction of the source term using measurements from a set of sensors is an important and challenging inverse problem. A rapid and accurate estimation of the source allows faster and more efficient action for first-response teams, in addition to providing better damage assessment. This paper presents a Bayesian probabilistic approach to estimate the location and the temporal emission profile of a pointwise source. The release rate is evaluated analytically by using a Gaussian assumption on its prior distribution, and is enhanced with a positivity constraint to improve the estimation. The source location is obtained by means of an advanced iterative Monte-Carlo technique called Adaptive Multiple Importance Sampling (AMIS), which uses a recycling process at each iteration to accelerate its convergence. The proposed methodology is tested using synthetic and real concentration data in the framework of the Fusion Field Trials 2007 (FFT-07) experiment. The quality of the obtained results is comparable to that of the Markov Chain Monte Carlo (MCMC) algorithm, a popular Bayesian method used for source estimation. Moreover, the adaptive processing of AMIS provides better sampling efficiency by reusing all the generated samples.
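
    In spirit, AMIS draws from a sequence of adapted proposals and re-weights every past sample against the deterministic mixture of all proposals used so far. The sketch below shows that recycling loop on a stand-in 1D posterior; the target, the Gaussian proposal family, and the sample counts are assumptions, whereas the real method targets a source location through an atmospheric dispersion model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def log_post(x):
    """Stand-in 1D log-posterior (assumed); the real target couples sensor
    data to a dispersion model."""
    return norm.logpdf(x, loc=2.0, scale=0.3)

mus, sigmas = [0.0], [5.0]                  # deliberately vague initial proposal
X = np.array([])

for it in range(5):
    X = np.concatenate([X, rng.normal(mus[-1], sigmas[-1], size=200)])
    # AMIS recycling: weight *every* sample against the equal mixture of all
    # proposals used so far (deterministic-mixture weights)
    mix = sum(norm.pdf(X, m, s) for m, s in zip(mus, sigmas)) / len(mus)
    logw = log_post(X) - np.log(mix)
    w = np.exp(logw - logw.max()); w /= w.sum()
    # adapt the next proposal to the weighted moments of the recycled samples
    mu_new = np.sum(w * X)
    sd_new = np.sqrt(np.sum(w * (X - mu_new) ** 2))
    mus.append(mu_new); sigmas.append(max(sd_new, 1e-3))

print("posterior mean ~", round(mu_new, 3), " posterior sd ~", round(sd_new, 3))
```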

  9. Coupled multi-group neutron photon transport for the simulation of high-resolution gamma-ray spectroscopy applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, Kimberly A.

    2009-08-01

    The accurate and efficient simulation of coupled neutron-photon problems is necessary for several important radiation detection applications. Examples include the detection of nuclear threats concealed in cargo containers and prompt gamma neutron activation analysis for nondestructive determination of elemental composition of unknown samples.

  10. Public Attitudes towards Socio-Cultural Aspects of Water Supply and Sanitation Services: Palestine as a Case Study

    ERIC Educational Resources Information Center

    Haddad, Marwan

    2005-01-01

    Identifying and considering public attitudes towards various aspects of water supply and sanitation services by planners and decision makers represent an important developmental element relating to the quality, efficiency, and performance of those services. A sample of 1000 Palestinian adults completed a questionnaire assessing attitudes towards…

  11. Exposomics research using suspect screening and non-targeted analysis methods and tools at the U.S. Environmental Protection Agency (ASMS Presentation)

    EPA Science Inventory

    High-resolution mass spectrometry (HRMS) is used for suspect screening (SSA) and non-targeted analysis (NTA) in an attempt to characterize xenobiotic chemicals in various samples broadly and efficiently. These important techniques aid characterization of the exposome, the totalit...

  12. Wildlife Conservation Planning Using Stochastic Optimization and Importance Sampling

    Treesearch

    Robert G. Haight; Laurel E. Travis

    1997-01-01

    Formulations for determining conservation plans for sensitive wildlife species must account for economic costs of habitat protection and uncertainties about how wildlife populations will respond. This paper describes such a formulation and addresses the computational challenge of solving it. The problem is to determine the cost-efficient level of habitat protection...

  13. Deterministic alternatives to the full configuration interaction quantum Monte Carlo method for strongly correlated systems

    NASA Astrophysics Data System (ADS)

    Tubman, Norm; Whaley, Birgitta

    The development of exponential scaling methods has seen great progress in tackling larger systems than previously thought possible. One such technique, full configuration interaction quantum Monte Carlo, allows exact diagonalization through stochastic sampling of determinants. The method derives its utility from the information in the matrix elements of the Hamiltonian, together with a stochastic projected wave function, which are used to explore the important parts of Hilbert space. However, a stochastic representation of the wave function is not required to search Hilbert space efficiently, and new deterministic approaches have recently been shown to efficiently find the important parts of determinant space. We shall discuss the technique of Adaptive Sampling Configuration Interaction (ASCI) and the related heat-bath Configuration Interaction approach for ground state and excited state simulations. We will present several applications for strongly correlated Hamiltonians. This work was supported through the Scientific Discovery through Advanced Computing (SciDAC) program funded by the U.S. Department of Energy, Office of Science, Advanced Scientific Computing Research and Basic Energy Sciences.

  14. Temperamental factors in remitted depression: The role of effortful control and attentional mechanisms.

    PubMed

    Marchetti, Igor; Shumake, Jason; Grahek, Ivan; Koster, Ernst H W

    2018-08-01

    Temperamental effortful control and attentional networks are increasingly viewed as important underlying processes in depression and anxiety. However, it is still unknown whether these factors facilitate depressive and anxiety symptoms in the general population and, more specifically, in remitted depressed individuals. We investigated to what extent effortful control and attentional networks (i.e., the Attention Network Task) explain concurrent depressive and anxious symptoms in healthy individuals (n = 270) and remitted depressed individuals (n = 90). Both samples were highly representative of the US population. Increased effortful control predicted a substantial decrease in symptoms of both depression and anxiety in the whole sample, whereas decreased efficiency of executive attention predicted a modest increase in depressive symptoms. Remitted depressed individuals showed neither less effortful control nor less efficient attentional networks than healthy individuals. Moreover, clinical status did not moderate the relationship between temperamental factors and either depressive or anxiety symptoms. Limitations include the cross-sectional nature of the study. Our study shows that temperamental effortful control represents an important transdiagnostic process for depressive and anxiety symptoms in adults.

  15. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE PAGES

    Li, Weixuan; Lin, Guang

    2015-03-21

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.
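
    A stripped-down version of the first ingredient can be sketched with an off-the-shelf Gaussian mixture fit: draw from the current GM proposal, compute importance weights against the target, resample by weight, and refit the mixture. The bimodal target below is a stand-in, the weighted-resample-then-refit adaptation is a common simplification rather than the authors' exact update, and the polynomial chaos surrogate is omitted entirely.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)

def log_post(x):
    """Stand-in bimodal target (assumed); in the paper this would be the
    Bayesian posterior, evaluated through a PC surrogate."""
    return np.logaddexp(mvn.logpdf(x, mean=[-2.0, 0.0]),
                        mvn.logpdf(x, mean=[2.0, 0.0])) - np.log(2.0)

gm = GaussianMixture(n_components=4, covariance_type="full")
gm.fit(rng.normal(scale=4.0, size=(500, 2)))        # broad initial proposal

for it in range(4):
    x, _ = gm.sample(2000)                          # draw from the GM proposal
    logw = log_post(x) - gm.score_samples(x)        # importance weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    idx = rng.choice(len(x), size=len(x), p=w)      # weighted resampling
    gm = GaussianMixture(n_components=4, covariance_type="full").fit(x[idx])

print("effective sample size of final batch:", 1.0 / np.sum(w ** 2))
```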

  16. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lin, Guang, E-mail: guanglin@purdue.edu

    2015-08-01

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.

  17. Extraction of trace tilmicosin in real water samples using ionic liquid-based aqueous two-phase systems.

    PubMed

    Pan, Ru; Shao, Dejia; Qi, Xueyong; Wu, Yun; Fu, Wenyan; Ge, Yanru; Fu, Haizhen

    2013-01-01

    The effective method of ionic liquid-based aqueous two-phase extraction, which involves ionic liquid (IL) (1-butyl-3-methyllimidazolium chloride, [C4mim]Cl) and inorganic salt (K2HPO4) coupled with high-performance liquid chromatography (HPLC), has been used to extract trace tilmicosin in real water samples which were passed through a 0.45 μm filter. The effects of the different types of salts, the concentration of K2HPO4 and of ILs, the pH value and temperature of the systems on the extraction efficiencies have all been investigated. Under the optimum conditions, the average extraction efficiency is up to 95.8%. This method was feasible when applied to the analysis of tilmicosin in real water samples within the range 0.5-40 μg/mL. The limit of detection was found to be 0.05 μg/mL. The recovery rate of tilmicosin was 92.0-99.0% from the real water samples by the proposed method. This process is suggested to have important applications for the extraction of tilmicosin.

  18. Highly efficient and ultra-small volume separation by pressure-driven liquid chromatography in extended nanochannels.

    PubMed

    Ishibashi, Ryo; Mawatari, Kazuma; Kitamori, Takehiko

    2012-04-23

    The rapidly developing interest in nanofluidic analysis, which is used to examine liquids ranging in amounts from the attoliter to the femtoliter scale, correlates with the recent interest in decreased sample amounts, such as in the field of single-cell analysis. For general nanofluidic analysis, the fact that a pressure-driven flow does not limit the choice of solvents (aqueous or organic) is important. This study shows the first pressure-driven liquid chromatography technique that enables separation of atto- to femtoliter sample volumes, with a high separation efficiency within a few seconds. The apparent diffusion coefficient measurement of the unretentive sample suggests that there is no increase in the viscosity of toluene in the extended nanospace, unlike in aqueous solvents. Evaluation of the normal phase separation, therefore, should involve only the examination of the effect of the small size of the extended nanospace. Compared to a conventionally packed high-performance liquid chromatography column, the separation here results in a faster separation (4 s) by 2 orders of magnitude, a smaller injection volume (10^0 fL) by 9 orders, and a higher separation efficiency (440,000 plates/m) by 1 order. Moreover, the separation behavior agrees with the theory showing that this high efficiency was due to the small and controlled size of the separation channel, where the diffusion through the channel depth direction is fast enough to be neglected. Our chip-based platform should allow direct and real-time analysis or screening of ultralow sample volumes.

  19. Effects of rainfall events on the occurrence and detection efficiency of viruses in river water impacted by combined sewer overflows.

    PubMed

    Hata, Akihiko; Katayama, Hiroyuki; Kojima, Keisuke; Sano, Shoichi; Kasuga, Ikuro; Kitajima, Masaaki; Furumai, Hiroaki

    2014-01-15

    Rainfall events can introduce large amounts of microbial contaminants, including human enteric viruses, into surface water through intermittent discharges from combined sewer overflows (CSOs). The present study aimed to investigate the effect of rainfall events on viral loads in surface waters impacted by CSOs and the reliability of molecular methods for detection of enteric viruses. The reliability of virus detection in the samples was assessed by using process controls for the virus concentration, nucleic acid extraction and reverse transcription (RT)-quantitative PCR (qPCR) steps, which allowed accurate estimation of virus detection efficiencies. Recovery efficiencies of poliovirus in river water samples collected during rainfall events (<10%) were lower than those during dry weather conditions (>10%). The log10-transformed virus concentration efficiency was negatively correlated with suspended solid concentration (r² = 0.86), which increased significantly during rainfall events. Efficiencies of the DNA extraction and qPCR steps, determined with adenovirus type 5 and a primer-sharing control, respectively, were lower in dry weather. However, no clear relationship was observed between organic water quality parameters and efficiencies of these two steps. Observed concentrations of indigenous enteric adenoviruses, GII-noroviruses, enteroviruses, and Aichi viruses increased during rainfall events even though the virus concentration efficiency was presumed to be lower than in dry weather. The present study highlights the importance of using appropriate process controls to evaluate accurately the concentration of waterborne enteric viruses in natural waters impacted by wastewater discharge, stormwater, and CSOs.

  20. Patient and Sample Identification. Out of the Maze?

    PubMed

    Lippi, Giuseppe; Chiozza, Laura; Mattiuzzi, Camilla; Plebani, Mario

    2017-04-01

    Patient and sample misidentification may cause significant harm or discomfort to the patients, especially when incorrect data is used for performing specific healthcare activities. It is hence obvious that efficient and quality care can only start from accurate patient identification. There are many opportunities for misidentification in healthcare and laboratory medicine, including homonymy, incorrect patient registration, reliance on wrong patient data, mistakes in order entry, collection of biological specimens from wrong patients, inappropriate sample labeling and inaccurate entry or erroneous transmission of test results through the laboratory information system. Many ongoing efforts are made to prevent this important healthcare problem, entailing streamlined strategies for identifying patients throughout the healthcare industry by means of traditional and innovative identifiers, as well as using technologic tools that may enhance both the quality and efficiency of blood tubes labeling. The aim of this article is to provide an overview about the liability of identification errors in healthcare, thus providing a pragmatic approach for diverging the so-called patient identification crisis.

  1. Surface Sampling Collection and Culture Methods for Escherichia coli in Household Environments with High Fecal Contamination

    PubMed Central

    Kosek, Margaret N.; Schwab, Kellogg J.

    2017-01-01

    Empiric quantification of environmental fecal contamination is an important step toward understanding the impact that water, sanitation, and hygiene interventions have on reducing enteric infections. There is a need to standardize the methods used for surface sampling in field studies that examine fecal contamination in low-income settings. The dry cloth method presented in this manuscript improves upon the more commonly used swabbing technique that has been shown in the literature to have a low sampling efficiency. The recovery efficiency of a dry electrostatic cloth sampling method was evaluated using Escherichia coli and then applied to household surfaces in Iquitos, Peru, where there is high fecal contamination and enteric infection. Side-by-side measurements were taken from various floor locations within a household at the same time over a three-month period to compare for consistency of quantification of E. coli bacteria. The dry cloth sampling method in the laboratory setting showed 105% (95% Confidence Interval: 98%, 113%) E. coli recovery efficiency from the cloths. The field application demonstrated strong agreement of side-by-side results (Pearson correlation coefficient for dirt surfaces was 0.83 (p < 0.0001) and 0.91 (p < 0.0001) for cement surfaces) and moderate agreement for results between entrance and kitchen samples (Pearson (0.53, p < 0.0001) and weighted Kappa statistic (0.54, p < 0.0001)). Our findings suggest that this method can be utilized in households with high bacterial loads using either continuous (quantitative) or categorical (semi-quantitative) data. The standardization of this low-cost, dry electrostatic cloth sampling method can be used to measure differences between households in intervention and non-intervention arms of randomized trials. PMID:28829392

  2. Surface Sampling Collection and Culture Methods for Escherichia coli in Household Environments with High Fecal Contamination.

    PubMed

    Exum, Natalie G; Kosek, Margaret N; Davis, Meghan F; Schwab, Kellogg J

    2017-08-22

    Empiric quantification of environmental fecal contamination is an important step toward understanding the impact that water, sanitation, and hygiene interventions have on reducing enteric infections. There is a need to standardize the methods used for surface sampling in field studies that examine fecal contamination in low-income settings. The dry cloth method presented in this manuscript improves upon the more commonly used swabbing technique that has been shown in the literature to have a low sampling efficiency. The recovery efficiency of a dry electrostatic cloth sampling method was evaluated using Escherichia coli and then applied to household surfaces in Iquitos, Peru, where there is high fecal contamination and enteric infection. Side-by-side measurements were taken from various floor locations within a household at the same time over a three-month period to compare for consistency of quantification of E. coli bacteria. The dry cloth sampling method in the laboratory setting showed 105% (95% Confidence Interval: 98%, 113%) E. coli recovery efficiency from the cloths. The field application demonstrated strong agreement of side-by-side results (Pearson correlation coefficient for dirt surfaces was 0.83 (p < 0.0001) and 0.91 (p < 0.0001) for cement surfaces) and moderate agreement for results between entrance and kitchen samples (Pearson (0.53, p < 0.0001) and weighted Kappa statistic (0.54, p < 0.0001)). Our findings suggest that this method can be utilized in households with high bacterial loads using either continuous (quantitative) or categorical (semi-quantitative) data. The standardization of this low-cost, dry electrostatic cloth sampling method can be used to measure differences between households in intervention and non-intervention arms of randomized trials.

  3. Estimating means and variances: The comparative efficiency of composite and grab samples.

    PubMed

    Brumelle, S; Nemetz, P; Casey, D

    1984-03-01

    This paper compares the efficiencies of two sampling techniques for estimating a population mean and variance. One procedure, called grab sampling, consists of collecting and analyzing one sample per period. The second procedure, called composite sampling, collects n samples per period, which are then pooled and analyzed as a single sample. We review the well known fact that composite sampling provides a superior estimate of the mean. However, it is somewhat surprising that composite sampling does not always generate a more efficient estimate of the variance. For populations with platykurtic distributions, grab sampling gives a more efficient estimate of the variance, whereas composite sampling is better for leptokurtic distributions. These conditions on kurtosis can be related to peakedness and skewness. For example, a necessary condition for composite sampling to provide a more efficient estimate of the variance is that the population density function evaluated at the mean (i.e. f(μ)) be greater than [Formula: see text]. If [Formula: see text], then a grab sample is more efficient. In spite of this result, however, composite sampling does provide a smaller estimate of standard error than does grab sampling in the context of estimating population means.
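
    The mean-estimation half of this comparison is easy to check by simulation. In the toy below, compositing n samples per period is modeled as analyzing their average once, and the variance of the resulting estimate of the population mean is compared with grab sampling's. The distribution choice, sample sizes, and replication counts are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n, periods, reps = 4, 250, 2000     # samples/period, periods, replications (assumed)

def mean_estimator_variances(draw):
    grab, comp = [], []
    for _ in range(reps):
        x = draw((periods, n))
        grab.append(x[:, 0].mean())         # grab: one analyzed sample per period
        comp.append(x.mean(axis=1).mean())  # composite: the pooled sample's value
    return np.var(grab), np.var(comp)

# lognormal is leptokurtic -- a case where compositing is also favored for variance
v_grab, v_comp = mean_estimator_variances(lambda s: rng.lognormal(size=s))
print("variance of mean estimate -- grab:", v_grab, " composite:", v_comp)
```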

  4. Interactions among resource partitioning, sampling effect, and facilitation on the biodiversity effect: a modeling approach.

    PubMed

    Flombaum, Pedro; Sala, Osvaldo E; Rastetter, Edward B

    2014-02-01

    Resource partitioning, facilitation, and sampling effect are the three mechanisms behind the biodiversity effect, which is depicted usually as the effect of plant-species richness on aboveground net primary production. These mechanisms operate simultaneously but their relative importance and interactions are difficult to unravel experimentally. Thus, niche differentiation and facilitation have been lumped together and separated from the sampling effect. Here, we propose three hypotheses about interactions among the three mechanisms and test them using a simulation model. The model simulated water movement through soil and vegetation, and net primary production mimicking the Patagonian steppe. Using the model, we created grass and shrub monocultures and mixtures, controlled root overlap and grass water-use efficiency (WUE) to simulate gradients of biodiversity, resource partitioning and facilitation. The presence of shrubs facilitated grass growth by increasing its WUE and in turn increased the sampling effect, whereas root overlap (resource partitioning) had, on average, no effect on sampling effect. Interestingly, resource partitioning and facilitation interacted so the effect of facilitation on sampling effect decreased as resource partitioning increased. Sampling effect was enhanced by the difference between the two functional groups in their efficiency in using resources. Morphological and physiological differences make one group outperform the other; once these differences were established further differences did not enhance the sampling effect. In addition, grass WUE and root overlap positively influence the biodiversity effect but showed no interactions.

  5. Evaluating the efficiency of a one-square-meter quadrat sampler for riffle-dwelling fish

    USGS Publications Warehouse

    Peterson, J.T.; Rabeni, C.F.

    2001-01-01

    We evaluated the efficacy of a 1-m² quadrat sampler for collecting riffle-dwelling fishes in an Ozark stream. We used a dual-gear approach to evaluate sampler efficiency in relation to species, fish size, and habitat variables. Quasi-likelihood regression showed sampling efficiency to differ significantly (P < 0.05). Sampling efficiency was significantly influenced by physical habitat characteristics. Mean current velocity negatively influenced sampling efficiencies for Cyprinidae (P = 0.009), Cottidae (P = 0.006), and Percidae (P < 0.001), and the amount of cobble substrate negatively influenced sampling efficiencies for Cyprinidae (P = 0.025), Ictaluridae (P < 0.001), and Percidae (P < 0.001). Water temperature negatively influenced sampling efficiency for Cyprinidae (P = 0.009) and Ictaluridae (P = 0.006). Species-richness efficiency was positively influenced (P = 0.002) by the percentage of riffle sampled. Under average habitat conditions encountered in stream riffles, the 1-m² quadrat sampler was most efficient at estimating the densities of Cyprinidae (84%) and Cottidae (80%) and least efficient for Percidae (57%) and Ictaluridae (31%).

  6. The 'number needed to sample' in primary care research. Comparison of two primary care sampling frames for chronic back pain.

    PubMed

    Smith, Blair H; Hannaford, Philip C; Elliott, Alison M; Smith, W Cairns; Chambers, W Alastair

    2005-04-01

    Sampling for primary care research must strike a balance between efficiency and external validity. For most conditions, even a large population sample will yield a small number of cases, yet other sampling techniques risk problems with extrapolation of findings. We compared the efficiency and external validity of two sampling methods for both an intervention study and epidemiological research in primary care--a convenience sample and a general population sample--by comparing the response and follow-up rates and the demographic and clinical characteristics of each sample, and by calculating the 'number needed to sample' (NNS) for a hypothetical randomized controlled trial. In 1996, we selected two random samples of adults from 29 general practices in Grampian, for an epidemiological study of chronic pain. One sample of 4175 was identified by an electronic search that listed patients receiving regular analgesic prescriptions--the 'repeat prescription sample'. The other sample of 5036 was identified from all patients on practice lists--the 'general population sample'. Questionnaires, including demographic, pain and general health measures, were sent to all. A similar follow-up questionnaire was sent in 2000 to all those agreeing to participate in further research. We identified a potential group of subjects for a hypothetical trial in primary care based on a recently published trial (those aged 25-64, with severe chronic back pain, willing to participate in further research). The repeat prescription sample produced better response rates than the general sample overall (86% compared with 82%, P < 0.001), from both genders and from the oldest and youngest age groups. The NNS using convenience sampling was 10 for each member of the final potential trial sample, compared with 55 using general population sampling. There were important differences between the samples in age, marital and employment status, social class and educational level. However, among the potential trial sample, there were no demographic differences. Those from the repeat prescription sample had poorer indices than the general population sample in all pain and health measures. The repeat prescription sampling method was approximately five times more efficient than the general population method. However, demographic and clinical differences in the repeat prescription sample might hamper extrapolation of findings to the general population, particularly in an epidemiological study, and demonstrate that simple comparison with age and gender of the target population is insufficient.
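
    The 'number needed to sample' used above is just the ratio of individuals approached to individuals reaching the final eligible sample. A trivial helper follows; the eligible counts are hypothetical, chosen only to reproduce the reported NNS values of roughly 10 and 55.

```python
def number_needed_to_sample(n_sampled: int, n_in_final_sample: int) -> float:
    """NNS: individuals approached per subject who ends up in the final
    potential trial sample (definition as used in the study above)."""
    return n_sampled / n_in_final_sample

# hypothetical eligible counts, chosen only to reproduce the reported ratios
print(number_needed_to_sample(4175, 418))   # repeat-prescription frame -> ~10
print(number_needed_to_sample(5036, 92))    # general-population frame  -> ~55
```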

  7. Legal & ethical compliance when sharing biospecimen.

    PubMed

    Klingstrom, Tomas; Bongcam-Rudloff, Erik; Reichel, Jane

    2018-01-01

    When obtaining samples from biobanks, resolving ethical and legal concerns is a time-consuming task where researchers need to balance the needs of privacy, trust and scientific progress. The Biobanking and Biomolecular Resources Research Infrastructure-Large Prospective Cohorts project has resolved numerous such issues through intense communication between involved researchers and experts in its mission to unite large prospective study sets in Europe. To facilitate efficient communication, it is useful for nonexperts to have at least a basic understanding of the regulatory system for managing biological samples. Laws regulating research oversight are based on national law and normally share core principles founded on international charters. In interview studies among donors, chief concerns are privacy, efficient sample utilization and access to information generated from their samples. Despite a lack of clear evidence regarding which concern takes precedence, scientific as well as public discourse has largely focused on privacy concerns and the right of donors to control the usage of their samples. It is therefore important to proactively deal with ethical and legal issues to avoid complications that delay or prevent samples from being accessed. To help biobank professionals avoid making unnecessary mistakes, we have developed this basic primer covering the relationship between ethics and law, the concept of informed consent and considerations for returning findings to donors.

  8. Legal & ethical compliance when sharing biospecimen

    PubMed Central

    Klingstrom, Tomas; Bongcam-Rudloff, Erik; Reichel, Jane

    2018-01-01

    When obtaining samples from biobanks, resolving ethical and legal concerns is a time-consuming task where researchers need to balance the needs of privacy, trust and scientific progress. The Biobanking and Biomolecular Resources Research Infrastructure-Large Prospective Cohorts project has resolved numerous such issues through intense communication between involved researchers and experts in its mission to unite large prospective study sets in Europe. To facilitate efficient communication, it is useful for nonexperts to have at least a basic understanding of the regulatory system for managing biological samples. Laws regulating research oversight are based on national law and normally share core principles founded on international charters. In interview studies among donors, chief concerns are privacy, efficient sample utilization and access to information generated from their samples. Despite a lack of clear evidence regarding which concern takes precedence, scientific as well as public discourse has largely focused on privacy concerns and the right of donors to control the usage of their samples. It is therefore important to proactively deal with ethical and legal issues to avoid complications that delay or prevent samples from being accessed. To help biobank professionals avoid making unnecessary mistakes, we have developed this basic primer covering the relationship between ethics and law, the concept of informed consent and considerations for returning findings to donors. PMID:28460118

  9. Factors affecting beef cattle producer perspectives on feed efficiency.

    PubMed

    Wulfhorst, J D; Ahola, J K; Kane, S L; Keenan, L D; Hill, R A

    2010-11-01

    To establish the basis for implementation of a producer education program, a social assessment of the willingness and barriers to adoption of a measure of feed efficiency in beef cattle [residual feed intake (RFI)] was conducted. A 35-question mailed survey was sent to 1,888 producers drawn from a stratified random sample of the Idaho Cattle Association member list (n = 488), Red Angus Association of America member list (n = 2,208), and Red Angus Association of America bull buyer list (n = 5,325). The adjusted response rate for the survey was 49.9%. Of the survey respondents, 58.7% were commercial cow/calf producers and 41.3% were seedstock producers or operated a combination seedstock/commercial operation. Commercial operations had an average of 223 ± 17 cows and 13 ± 3 bulls, whereas seedstock herds (including combination herds) had slightly fewer cows (206 ± 24) and more bulls (23 ± 6). Both commercial and seedstock operators indicated that calving ease/birth weight was the most important trait used to evaluate genetic merit of breeding bulls. Only 3.8 and 4.8% of commercial and seedstock producers, respectively, indicated that feed efficiency was the most important characteristic used for bull selection. Binary logistic regression models were used to predict willingness of seedstock producers to begin collecting data for the calculation of RFI on their bulls, or to predict willingness of commercial producers to begin selecting bulls based on RFI data. In response, 49.1% of commercial producers and 43.6% of seedstock producers indicated they were willing to adopt RFI as a measure of feed efficiency. These data indicate that feed efficiency was one of the traits that producers considered important; those who perceived feed efficiency as important tended to be actively involved in data collection on their herds, underpinning the notion that objective assessment was valued and used by some. Additional data collection in a future social assessment will further clarify the proportion of producers who perceive feed efficiency as an increasingly important decision and management tool for beef production.

  10. The effects of neutralized particles on the sampling efficiency of polyurethane foam used to estimate the extrathoracic deposition fraction.

    PubMed

    Tomyn, Ronald L; Sleeth, Darrah K; Thiese, Matthew S; Larson, Rodney R

    2016-01-01

    In addition to chemical composition, the site of deposition of inhaled particles is important for determining the potential health effects of an exposure. As a result, the International Organization for Standardization adopted a particle deposition sampling convention. This includes extrathoracic particle deposition sampling conventions for the anterior nasal passages (ET1) and the posterior nasal and oral passages (ET2). This study assessed how well a polyurethane foam insert placed in an Institute of Occupational Medicine (IOM) sampler can match an extrathoracic deposition sampling convention, while accounting for possible static buildup in the test particles. In this way, the study aimed to assess whether neutralized particles affected the performance of this sampler for estimating extrathoracic particle deposition. A total of three different particle sizes (4.9, 9.5, and 12.8 µm) were used. For each trial, one particle size was introduced into a low-speed wind tunnel with the wind speed set at 0.2 m/s (∼40 ft/min). This wind speed was chosen to closely match the conditions of most indoor working environments. Each particle size was tested twice, either neutralized using a high-voltage neutralizer or left in its normal (non-neutralized) state as standard particles. IOM samplers were fitted with a polyurethane foam insert and placed on a rotating mannequin inside the wind tunnel. Foam sampling efficiencies were calculated for all trials to compare against the normalized ET1 sampling deposition convention. The foam sampling efficiencies matched the ET1 deposition convention well for the larger particle sizes, but tended to underestimate it for all three particle sizes. The results of a Wilcoxon Rank Sum Test also showed that only at 4.9 µm was there a statistically significant difference (p-value = 0.03) between the foam sampling efficiency using the standard particles and the neutralized particles. This is interpreted to mean that static buildup may be occurring and that neutralizing the 4.9 µm particles did affect the performance of the foam sampler when estimating extrathoracic particle deposition.

  11. Estimating canopy cover from standard forest inventory measurements in western Oregon

    Treesearch

    Anne McIntosh; Andrew Gray; Steven. Garman

    2012-01-01

    Reliable measures of canopy cover are important in the management of public and private forests. However, direct sampling of canopy cover is both labor- and time-intensive. More efficient methods for estimating percent canopy cover could be empirically derived relationships between more readily measured stand attributes and canopy cover or, alternatively, the use of...

  12. Draft Genome Sequence of the Efficient Bioflocculant-Producing Bacterium Paenibacillus sp. Strain A9

    PubMed Central

    Liu, Jin-liang; Hu, Xiao-min

    2013-01-01

    Paenibacillus sp. strain A9 is an important bioflocculant-producing bacterium, isolated from a soil sample, and is pale pink-pigmented, aerobic, and Gram-positive. Here, we report the draft genome sequence and the initial findings from a preliminary analysis of strain A9, which is a novel species of Paenibacillus. PMID:23618713

  13. Importance Sampling of Word Patterns in DNA and Protein Sequences

    PubMed Central

    Chan, Hock Peng; Chen, Louis H.Y.

    2010-01-01

    Monte Carlo methods can provide accurate p-value estimates of word counting test statistics and are easy to implement. They are especially attractive when an asymptotic theory is absent or when either the search sequence or the word pattern is too short for the application of asymptotic formulae. Naive direct Monte Carlo is undesirable for the estimation of small probabilities because the associated rare events of interest are seldom generated. We propose instead efficient importance sampling algorithms that use controlled insertion of the desired word patterns on randomly generated sequences. The implementation is illustrated on word patterns of biological interest: palindromes and inverted repeats, patterns arising from position-specific weight matrices (PSWMs), and co-occurrences of pairs of motifs. PMID:21128856
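
    The controlled-insertion idea can be made concrete for the simplest case: an i.i.d. uniform null and a single exact word. Below, the proposal plants the motif at a uniformly chosen position in an otherwise random sequence; the likelihood ratio then has the closed form (1/4)^w · M / k, where M is the number of insertable positions and k the number of matches in the realized sequence. The motif, sequence length, and sample size are illustrative assumptions; the published algorithms additionally handle PSWMs, palindromes, and motif co-occurrences.

```python
import numpy as np

rng = np.random.default_rng(6)
ALPHABET = np.array(list("ACGT"))
motif, L, n_sim = "GAATTC", 60, 50000   # toy pattern, length, sample size (assumed)
w = len(motif)
M = L - w + 1                           # positions where the motif can be planted

def n_matches(seq):
    return sum(seq[i:i + w] == motif for i in range(M))

est = 0.0
for _ in range(n_sim):
    seq = "".join(rng.choice(ALPHABET, size=L))
    pos = rng.integers(M)
    seq = seq[:pos] + motif + seq[pos + w:]   # controlled insertion of the word
    k = n_matches(seq)                        # k >= 1 by construction
    est += (0.25 ** w) * M / k                # importance weight P_null / Q
est /= n_sim
print("IS estimate of P(at least one occurrence):", est)
print("first-order benchmark M/4^w:", M * 0.25 ** w)
```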

  14. ELF: An Extended-Lagrangian Free Energy Calculation Module for Multiple Molecular Dynamics Engines.

    PubMed

    Chen, Haochuan; Fu, Haohao; Shao, Xueguang; Chipot, Christophe; Cai, Wensheng

    2018-06-18

    Extended adaptive biasing force (eABF), a collective variable (CV)-based importance-sampling algorithm, has proven to be very robust and efficient compared with the original ABF algorithm. Its implementation in Colvars, a software addition to molecular dynamics (MD) engines, is, however, currently limited to NAMD and LAMMPS. To broaden the scope of eABF and its variants, like its generalized form (egABF), and make them available to other MD engines, e.g., GROMACS, AMBER, CP2K, and openMM, we present a PLUMED-based implementation, called extended-Lagrangian free energy calculation (ELF). This implementation can be used as a stand-alone gradient estimator for other CV-based sampling algorithms, such as temperature-accelerated MD (TAMD) and extended-Lagrangian metadynamics (MtD). ELF provides the end user with a convenient framework to help select the best-suited importance-sampling algorithm for a given application without any commitment to a particular MD engine.

  15. Rational BRDF.

    PubMed

    Pacanowski, Romain; Salazar Celis, Oliver; Schlick, Christophe; Granier, Xavier; Poulin, Pierre; Cuyt, Annie

    2012-11-01

    Over the last two decades, much effort has been devoted to accurately measuring Bidirectional Reflectance Distribution Functions (BRDFs) of real-world materials and to using the resulting data efficiently for rendering. Because of their large size, it is difficult to use measured BRDFs directly for real-time applications, and fitting the most sophisticated analytical BRDF models is still a complex task. In this paper, we introduce Rational BRDF, a general-purpose and efficient representation for arbitrary BRDFs, based on Rational Functions (RFs). Using an adapted parametrization, we demonstrate how Rational BRDFs offer 1) a more compact and efficient representation using low-degree RFs, 2) an accurate fitting of measured materials with guaranteed control of the residual error, and 3) efficient importance sampling by applying the same fitting process to determine the inverse of the Cumulative Distribution Function (CDF) generated from the BRDF for use in Monte-Carlo rendering.
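
    Point 3) rests on standard inverse-CDF sampling: once the reflectance lobe can be evaluated cheaply, its CDF can be inverted to turn uniform random numbers into importance-sampled directions. The sketch below tabulates and interpolates the inverse numerically on an assumed 1D stand-in lobe, whereas the paper fits a rational function to the inverse CDF itself.

```python
import numpy as np

rng = np.random.default_rng(7)

def fitted_lobe(theta):
    """Stand-in 1D slice of a fitted reflectance lobe (assumed), playing the
    role of a rational approximation evaluated along one angle."""
    return (1.0 + np.cos(theta) ** 8) / (1.0 + 0.5 * theta ** 2)

# Tabulate the CDF on a grid and invert it by interpolation.
theta = np.linspace(0.0, np.pi / 2.0, 1024)
pdf = fitted_lobe(theta)
cdf = np.cumsum(pdf); cdf /= cdf[-1]

u = rng.random(100000)
samples = np.interp(u, cdf, theta)        # inverse-CDF importance sampling
# Monte-Carlo weights for these samples are proportional to 1/pdf(sample).
print("mean sampled angle:", samples.mean())
```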

  16. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    NASA Astrophysics Data System (ADS)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviations (expectation values) of dose show average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity, while run-times of sampling-based computations are linear in the number of fractions. Using sum sampling within APM, uncertainty propagation can only be accelerated at the cost of reduced accuracy in variance calculations. For probabilistic plan optimization, we were able to approximate the necessary pre-computations within seconds, yielding treatment plans of similar quality as gained from exact uncertainty propagation. APM is suited to enhance the trade-off between speed and accuracy in uncertainty propagation and probabilistic treatment plan optimization, especially in the context of fractionation. This brings fully-fledged APM computations within reach of clinical application.
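
    The contrast between closed-form and sampled uncertainty propagation is easy to see for a linearized dose model: if inputs are Gaussian and dose is linear in them, the dose mean and covariance follow in closed form. Everything below (the random dose-influence matrix, the input covariance) is a toy stand-in; APM itself propagates through Gaussian pencil-beam dose calculations rather than a generic linear map.

```python
import numpy as np

rng = np.random.default_rng(8)
n_beams, n_voxels = 30, 200

D = rng.random((n_voxels, n_beams))          # toy dose-influence matrix (assumed)
mu = np.full(n_beams, 1.0)                   # nominal input values
Sigma = 0.01 * np.eye(n_beams)               # input (range/setup) covariance, assumed

# Closed-form propagation, in the spirit of APM: moments of the dose
# distribution without drawing a single scenario.
mean_dose = D @ mu
std_analytic = np.sqrt(np.diag(D @ Sigma @ D.T))

# Sampling benchmark (the paper's reference uses 5000 random scenarios).
x = rng.multivariate_normal(mu, Sigma, size=5000)
std_sampled = (x @ D.T).std(axis=0)
print("max |analytic - sampled| std:", np.abs(std_analytic - std_sampled).max())
```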

  17. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy.

    PubMed

    Wahl, N; Hennig, P; Wieser, H P; Bangert, M

    2017-06-26

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviations (expectation values) of dose show average global γ(3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity, while run-times of sampling-based computations are linear in the number of fractions. Using sum sampling within APM, uncertainty propagation can only be accelerated at the cost of reduced accuracy in variance calculations. For probabilistic plan optimization, we were able to approximate the necessary pre-computations within seconds, yielding treatment plans of similar quality as gained from exact uncertainty propagation. APM is suited to enhance the trade-off between speed and accuracy in uncertainty propagation and probabilistic treatment plan optimization, especially in the context of fractionation. This brings fully-fledged APM computations within reach of clinical application.

  18. Parametric dictionary learning for modeling EAP and ODF in diffusion MRI.

    PubMed

    Merlet, Sylvain; Caruyer, Emmanuel; Deriche, Rachid

    2012-01-01

    In this work, we propose an original and efficient approach to exploit the ability of Compressed Sensing (CS) to recover diffusion MRI (dMRI) signals from a limited number of samples while efficiently recovering important diffusion features such as the ensemble average propagator (EAP) and the orientation distribution function (ODF). Some attempts to sparsely represent the diffusion signal have already been performed. However, and contrary to what has been presented in CS dMRI, in this work we propose and advocate the use of a well-adapted learned dictionary and show that it leads to a sparser signal estimation as well as to an efficient reconstruction of very important diffusion features. We first propose to learn and design a sparse and parametric dictionary from a set of training diffusion data. Then, we propose a framework to analytically estimate in closed form two important diffusion features: the EAP and the ODF. Various experiments on synthetic, phantom and human brain data have been carried out, and promising results with a reduced number of atoms have been obtained on diffusion signal reconstruction, thus illustrating the added value of our method over state-of-the-art SHORE and SPF based approaches.
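
    Once a dictionary is in hand, the CS recovery step is a sparse regression. The sketch below uses generic iterative soft-thresholding (ISTA) on a random stand-in dictionary; the paper's parametric learned dictionary and its closed-form EAP/ODF estimates are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(9)
n_samples, n_atoms, sparsity = 40, 120, 5

Phi = rng.normal(size=(n_samples, n_atoms))
Phi /= np.linalg.norm(Phi, axis=0)                  # stand-in "learned" dictionary
c_true = np.zeros(n_atoms)
c_true[rng.choice(n_atoms, sparsity, replace=False)] = rng.normal(size=sparsity)
y = Phi @ c_true                                    # undersampled "signal"

def ista(y, Phi, lam=0.01, n_iter=500):
    """Iterative soft-thresholding for min 0.5*||y - Phi c||^2 + lam*||c||_1
    (a generic CS solver, not the paper's)."""
    L_lip = np.linalg.norm(Phi, 2) ** 2             # Lipschitz constant of gradient
    c = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        g = c + Phi.T @ (y - Phi @ c) / L_lip       # gradient step
        c = np.sign(g) * np.maximum(np.abs(g) - lam / L_lip, 0.0)  # shrinkage
    return c

c_hat = ista(y, Phi)
print("relative recovery error:",
      np.linalg.norm(c_hat - c_true) / np.linalg.norm(c_true))
```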

  19. A model for estimating passive integrated transponder (PIT) tag antenna efficiencies for interval-specific emigration rates

    USGS Publications Warehouse

    Horton, G.E.; Dubreuil, T.L.; Letcher, B.H.

    2007-01-01

    Our goal was to understand movement and its interaction with survival for populations of stream salmonids at long-term study sites in the northeastern United States by employing passive integrated transponder (PIT) tags and associated technology. Although our PIT tag antenna arrays spanned the stream channel (at most flows) and were continuously operated, we are aware that aspects of fish behavior, environmental characteristics, and electronic limitations influenced our ability to detect 100% of the emigration from our stream site. Therefore, we required antenna efficiency estimates to adjust observed emigration rates. We obtained such estimates by testing a full-scale physical model of our PIT tag antenna array in a laboratory setting. From the physical model, we developed a statistical model that we used to predict efficiency in the field. The factors most important for predicting efficiency were external radio frequency signal and tag type. For most sampling intervals, there was concordance between the predicted and observed efficiencies, which allowed us to estimate the true emigration rate for our field populations of tagged salmonids. One caveat is that the model's utility may depend on its ability to characterize external radio frequency signals accurately. Another important consideration is the trade-off between the volume of data necessary to model efficiency accurately and the difficulty of storing and manipulating large amounts of data.

  20. Susceptibility-matched plugs for microcoil NMR probes

    NASA Astrophysics Data System (ADS)

    Kc, Ravi; Gowda, Yashas N.; Djukovic, Danijel; Henry, Ian D.; Park, Gregory H. J.; Raftery, Daniel

    2010-07-01

    For mass-limited samples, the residual sample volume outside the detection coil is an important concern, as is good baseline resolution. Here, we present the construction and evaluation of magnetic susceptibility-matched plugs for microcoil NMR sample cells which address these issues. Mixed-epoxy glue and Ultem tube plugs that have susceptibility values close to those of perfluorocarbon FC-43 (Fluorinert) and copper were used in small-volume (0.5-2 μL) and larger-volume (15-20 μL) thin glass capillary sample cells. Using these plugs, the sample volume efficiency (i.e., the ratio of active volume to total sample volume in the microcoil NMR cell) was improved 6- to 12-fold without sensitivity or resolution trade-offs. A comparison with laser-etched or heat-etched microcoil sample cells is provided. The approaches described are potentially useful in metabolomics for biomarker detection in mass-limited biological samples.

  1. Susceptibility-matched plugs for microcoil NMR probes.

    PubMed

    Kc, Ravi; Gowda, Yashas N; Djukovic, Danijel; Henry, Ian D; Park, Gregory H J; Raftery, Daniel

    2010-07-01

    For mass-limited samples, the residual sample volume outside the detection coil is an important concern, as is good baseline resolution. Here, we present the construction and evaluation of magnetic susceptibility-matched plugs for microcoil NMR sample cells which address these issues. Mixed-epoxy glue and Ultem tube plugs that have susceptibility values close to those of perfluorocarbon FC-43 (Fluorinert) and copper were used in small-volume (0.5-2 μL) and larger-volume (15-20 μL) thin glass capillary sample cells. Using these plugs, the sample volume efficiency (i.e., the ratio of active volume to total sample volume in the microcoil NMR cell) was improved 6- to 12-fold without sensitivity or resolution trade-offs. A comparison with laser-etched or heat-etched microcoil sample cells is provided. The approaches described are potentially useful in metabolomics for biomarker detection in mass-limited biological samples. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  2. Volatile organic compounds: sampling methods and their worldwide profile in ambient air.

    PubMed

    Kumar, Anuj; Víden, Ivan

    2007-08-01

    The atmosphere is a particularly difficult analytical system because of the very low levels of the substances to be analysed, sharp variations in pollutant levels with time and location, and differences in wind, temperature, and humidity. This makes the selection of an efficient sampling technique a key step toward reliable results in air analysis. Generally, methods for sampling volatile organic compounds involve either collection of whole air or preconcentration of samples on adsorbents. The methods differ from each other in sampling technique, type of sorbent, method of extraction, and identification technique. In this review paper we discuss various important aspects of sampling volatile organic compounds by the most widely used and advanced sampling methods. Characteristics of the various adsorbents used for VOC sampling are also described. Furthermore, this paper comprehensively reviews the concentration levels of volatile organic compounds, along with the methodology used for their analysis, in major cities of the world.

  3. Susceptibility-matched plugs for microcoil NMR probes

    PubMed Central

    Kc, Ravi; Gowda, Yashas N.; Djukovic, Danijel; Henry, Ian D; Park, Gregory H J; Raftery, Daniel

    2010-01-01

    For mass-limited samples, the residual sample volume outside the detection coil is an important concern, as is good baseline resolution. Here, we present the construction and evaluation of magnetic susceptibility-matched plugs for microcoil NMR sample cells which address these issues. Mixed-epoxy glue and Ultem tube plugs that have susceptibility values close to those of perfluorocarbon FC-43 (Fluorinert) and copper were used in small-volume (0.5 to 2 μL) and larger-volume (15 to 20 μL) thin glass capillary sample cells. Using these plugs, the sample volume efficiency (i.e., the ratio of active volume to total sample volume in the microcoil NMR cell) was improved 6- to 12-fold without sensitivity or resolution trade-offs. A comparison with laser-etched or heat-etched microcoil sample cells is provided. The approaches described are potentially useful in metabolomics for biomarker detection in mass-limited biological samples. PMID:20510638

  4. Controlled defects in semiconducting carbon nanotubes promote efficient generation and luminescence of trions.

    PubMed

    Brozena, Alexandra H; Leeds, Jarrett D; Zhang, Yin; Fourkas, John T; Wang, YuHuang

    2014-05-27

    We demonstrate efficient creation of defect-bound trions through chemical doping of controlled sp(3) defect sites in semiconducting, single-walled carbon nanotubes. These tricarrier quasi-particles luminesce almost as brightly as their parent excitons, indicating a remarkably efficient conversion of excitons into trions. Substantial populations of trions can be generated at low excitation intensities, even months after a sample has been prepared. Photoluminescence spectroscopy reveals a trion binding energy as high as 262 meV, which is substantially larger than any previously reported values. This discovery may have important ramifications not only for studying the basic physics of trions but also for the application of these species in fields such as photonics, electronics, and bioimaging.

  5. Efficient Construction of Free Energy Profiles of Breathing Metal–Organic Frameworks Using Advanced Molecular Dynamics Simulations

    PubMed Central

    2017-01-01

    In order to reliably predict and understand the breathing behavior of highly flexible metal–organic frameworks from thermodynamic considerations, an accurate estimation of the free energy difference between their different metastable states is a prerequisite. Herein, a variety of free energy estimation methods are thoroughly tested for their ability to construct the free energy profile as a function of the unit cell volume of MIL-53(Al). The methods comprise free energy perturbation, thermodynamic integration, umbrella sampling, metadynamics, and variationally enhanced sampling. A series of molecular dynamics simulations have been performed in the frame of each of the five methods to describe structural transformations in flexible materials with the volume as the collective variable, which offers a unique opportunity to assess their computational efficiency. Subsequently, the most efficient method, umbrella sampling, is used to construct an accurate free energy profile at different temperatures for MIL-53(Al) from first principles at the PBE+D3(BJ) level of theory. This study yields insight into the importance of different aspects, such as entropy and anharmonicity contributions, for the resulting free energy profile. As such, this thorough study provides unparalleled insight into the thermodynamics of the large structural deformations of flexible materials. PMID:29131647
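
    As an illustration of the umbrella sampling approach found most efficient here, the sketch below biases a toy one-dimensional double well with a harmonic window, samples the biased ensemble with Metropolis Monte Carlo, and removes the bias from the histogram. This is a single-window cartoon with hypothetical parameters, not the MIL-53(Al) volume calculation; production work combines many overlapping windows (e.g., with WHAM).

        import numpy as np

        beta = 4.0                        # inverse temperature (kT = 0.25)
        U = lambda s: (s**2 - 1.0)**2     # toy double-well profile along the CV
        s0, kappa = 0.0, 10.0             # window centre and harmonic bias strength
        w = lambda s: 0.5 * kappa * (s - s0)**2

        rng = np.random.default_rng(2)
        s, samples = s0, []
        for _ in range(100_000):          # Metropolis sampling of the *biased* ensemble
            trial = s + rng.normal(0.0, 0.1)
            if rng.random() < np.exp(-beta * (U(trial) + w(trial) - U(s) - w(s))):
                s = trial
            samples.append(s)

        hist, edges = np.histogram(samples[1000:], bins=40, density=True)
        centres = 0.5 * (edges[:-1] + edges[1:])
        mask = hist > 0
        # Remove the bias: F(s) = -kT ln p_biased(s) - w(s) + const.
        F = -np.log(hist[mask]) / beta - w(centres[mask])
        print(np.round(F - F.min(), 2))   # free energy profile inside this window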

  6. Efficient Construction of Free Energy Profiles of Breathing Metal-Organic Frameworks Using Advanced Molecular Dynamics Simulations.

    PubMed

    Demuynck, Ruben; Rogge, Sven M J; Vanduyfhuys, Louis; Wieme, Jelle; Waroquier, Michel; Van Speybroeck, Veronique

    2017-12-12

    In order to reliably predict and understand the breathing behavior of highly flexible metal-organic frameworks from thermodynamic considerations, an accurate estimation of the free energy difference between their different metastable states is a prerequisite. Herein, a variety of free energy estimation methods are thoroughly tested for their ability to construct the free energy profile as a function of the unit cell volume of MIL-53(Al). The methods comprise free energy perturbation, thermodynamic integration, umbrella sampling, metadynamics, and variationally enhanced sampling. A series of molecular dynamics simulations have been performed in the frame of each of the five methods to describe structural transformations in flexible materials with the volume as the collective variable, which offers a unique opportunity to assess their computational efficiency. Subsequently, the most efficient method, umbrella sampling, is used to construct an accurate free energy profile at different temperatures for MIL-53(Al) from first principles at the PBE+D3(BJ) level of theory. This study yields insight into the importance of different aspects, such as entropy and anharmonicity contributions, for the resulting free energy profile. As such, this thorough study provides unparalleled insight into the thermodynamics of the large structural deformations of flexible materials.

  7. Clearing muddied waters: Capture of environmental DNA from turbid waters.

    PubMed

    Williams, Kelly E; Huyvaert, Kathryn P; Piaggio, Antoinette J

    2017-01-01

    Understanding the differences in efficiencies of various methods to concentrate, extract, and amplify environmental DNA (eDNA) is vital for best performance of eDNA detection. Aquatic systems vary in characteristics such as turbidity, eDNA concentration, and inhibitor load, thus affecting eDNA capture efficiency. Application of eDNA techniques to the detection of terrestrial invasive or endangered species may require sampling at intermittent water sources that are used for drinking and cooling; these water bodies may often be stagnant and turbid. We present our best practices technique for the detection of wild pig eDNA in water samples, a protocol that will have wide applicability to the detection of elusive vertebrate species. We determined the best practice for eDNA capture in a turbid water system was to concentrate DNA from a 15 mL water sample via centrifugation, purify DNA with the DNeasy mericon Food kit, and remove inhibitors with Zymo Inhibitor Removal Technology columns. Further, we compared the sensitivity of conventional PCR to quantitative PCR and found that quantitative PCR was more sensitive in detecting lower concentrations of eDNA. We show significant differences in efficiencies among methods in each step of eDNA capture, emphasizing the importance of optimizing best practices for the system of interest.

  8. Clearing muddied waters: Capture of environmental DNA from turbid waters

    PubMed Central

    Huyvaert, Kathryn P.; Piaggio, Antoinette J.

    2017-01-01

    Understanding the differences in efficiencies of various methods to concentrate, extract, and amplify environmental DNA (eDNA) is vital for best performance of eDNA detection. Aquatic systems vary in characteristics such as turbidity, eDNA concentration, and inhibitor load, thus affecting eDNA capture efficiency. Application of eDNA techniques to the detection of terrestrial invasive or endangered species may require sampling at intermittent water sources that are used for drinking and cooling; these water bodies may often be stagnant and turbid. We present our best practices technique for the detection of wild pig eDNA in water samples, a protocol that will have wide applicability to the detection of elusive vertebrate species. We determined the best practice for eDNA capture in a turbid water system was to concentrate DNA from a 15 mL water sample via centrifugation, purify DNA with the DNeasy mericon Food kit, and remove inhibitors with Zymo Inhibitor Removal Technology columns. Further, we compared the sensitivity of conventional PCR to quantitative PCR and found that quantitative PCR was more sensitive in detecting lower concentrations of eDNA. We show significant differences in efficiencies among methods in each step of eDNA capture, emphasizing the importance of optimizing best practices for the system of interest. PMID:28686659

  9. Evaluating different methods used in ethnobotanical and ecological studies to record plant biodiversity

    PubMed Central

    2014-01-01

    Background This study compares the efficiency of identifying the plants in an area of semi-arid Northeast Brazil by methods that a) access the local knowledge used in ethnobotanical studies, via semi-structured interviews conducted within the entire community, an inventory interview conducted with two participants using the previously collected vegetation inventory, and a participatory workshop presenting exsiccates and photographs to 32 people; and b) inventory the vegetation (phytosociology) in locations with different histories of disturbance using rectangular plots and quadrant points. Methods The proportion of species identified using each method was compared with Cochran's Q test. We calculated the use value (UV) of each species using the semi-structured interviews; this quantitative index was correlated against values of the vegetation's structural importance obtained from the sample plot method and the point-centered quarter method applied in two areas with different historical usage. The analysis sought to correlate the relative importance of plants to the local community (use value, UV) with the ecological importance of the plants in the vegetation structure (importance value, IV; relative density, RD) by using different sampling methods to analyze the two areas. Results With regard to the methods used for accessing local knowledge, a difference was observed among the ethnobotanical methods of surveying species (Q = 13.37, df = 2, p = 0.0013): 44 species were identified in the inventory interview, 38 in the participatory workshop, and 33 in the semi-structured interviews with the community. There was either no correlation between the UV, relative density (RD), and importance value (IV) of some species, or this correlation was negative. Conclusion It was concluded that the inventory interview was the most efficient method for recording species and their uses, as it allowed more plants to be identified in their original environment. To optimize researchers' time in future studies, the use of the point-centered quarter method rather than the sample plot method is recommended. PMID:24916833
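
    The Cochran's Q statistic used to compare the surveying methods can be computed directly from a binary species-by-method matrix. A sketch of the formula with hypothetical data (not the study's records):

        import numpy as np
        from scipy.stats import chi2

        # Hypothetical binary matrix: rows = species, columns = survey methods
        # (1 = species recorded by that method, 0 = not recorded).
        X = np.array([[1, 1, 0], [1, 1, 1], [1, 0, 0], [0, 1, 0],
                      [1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 1, 0]])

        k = X.shape[1]              # number of methods compared
        C = X.sum(axis=0)           # per-method totals
        R = X.sum(axis=1)           # per-species totals
        N = X.sum()

        Q = (k - 1) * (k * (C**2).sum() - N**2) / (k * N - (R**2).sum())
        print(f"Q = {Q:.2f}, p = {chi2.sf(Q, df=k - 1):.3f}")  # Q = 5.33, p = 0.070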

  10. Effects of forcefield and sampling method in all-atom simulations of inherently disordered proteins: Application to conformational preferences of human amylin

    PubMed Central

    Peng, Enxi; Todorova, Nevena

    2017-01-01

    Although several computational modelling studies have investigated the conformational behaviour of the inherently disordered protein (IDP) amylin, discrepancies in identifying its preferred solution conformations still exist between the various forcefields and sampling methods used. Human islet amyloid polypeptide has long been a subject of research, both experimental and theoretical, as the aggregation of this protein is believed to be the leading cause of type-II diabetes. In this work, we present a systematic forcefield assessment using one of the most advanced non-biased sampling techniques, Replica Exchange with Solute Tempering (REST2), by comparing the secondary structure preferences of monomeric amylin in solution. This study also aims to determine the ability of common forcefields to sample a transition of the protein from a helical membrane-bound conformation into the disordered solution state of amylin. Our results demonstrated that the CHARMM22* forcefield showed the best ability to sample the multiple conformational states inherent to amylin. REST2 yielded results qualitatively consistent with experiments and in quantitative agreement with other sampling methods, while being far more computationally efficient and without any bias. Therefore, combining an unbiased sampling technique such as REST2 with vigorous forcefield testing could be suggested as an important step in developing an efficient and robust strategy for simulating IDPs. PMID:29023509

  11. Concentration and purification of HIV-1 virions by microfluidic separation of superparamagnetic nanoparticles

    PubMed Central

    Chen, Grace Dongqing; Alberts, Catharina Johanna

    2009-01-01

    The low concentration and complex sample matrix of many clinical and environmental viral samples present a significant challenge in the development of low-cost, point-of-care viral assays. To address this problem, we investigated the use of a microfluidic passive magnetic separator combined with an on-chip mixer to both purify and concentrate whole-particle HIV-1 virions. Virus-containing plasma samples are first mixed to allow specific binding of the viral particles to antibody-conjugated superparamagnetic nanoparticles, and several passive mixer geometries were assessed for their mixing efficiencies. The virus-nanoparticle complexes are then separated from the plasma in a novel magnetic separation chamber, where packed micron-sized ferromagnetic particles serve as high-magnetic-gradient concentrators for an externally applied magnetic field. Thereafter, a viral lysis buffer was flowed through the chip and the released HIV proteins were assayed off-chip. Viral protein extraction efficiencies of 62% and 45% were achieved at throughputs of 10 μL/min and 30 μL/min, respectively. More importantly, an 80-fold concentration was observed for an initial sample volume of 1 mL, and a 44-fold concentration for an initial sample volume of 0.5 mL. The system is broadly applicable to microscale sample preparation of any viral sample and can be used for nucleic acid extraction as well as 40- to 80-fold enrichment of target viruses. PMID:19954210

  12. Effects of forcefield and sampling method in all-atom simulations of inherently disordered proteins: Application to conformational preferences of human amylin.

    PubMed

    Peng, Enxi; Todorova, Nevena; Yarovsky, Irene

    2017-01-01

    Although several computational modelling studies have investigated the conformational behaviour of the inherently disordered protein (IDP) amylin, discrepancies in identifying its preferred solution conformations still exist between the various forcefields and sampling methods used. Human islet amyloid polypeptide has long been a subject of research, both experimental and theoretical, as the aggregation of this protein is believed to be the leading cause of type-II diabetes. In this work, we present a systematic forcefield assessment using one of the most advanced non-biased sampling techniques, Replica Exchange with Solute Tempering (REST2), by comparing the secondary structure preferences of monomeric amylin in solution. This study also aims to determine the ability of common forcefields to sample a transition of the protein from a helical membrane-bound conformation into the disordered solution state of amylin. Our results demonstrated that the CHARMM22* forcefield showed the best ability to sample the multiple conformational states inherent to amylin. REST2 yielded results qualitatively consistent with experiments and in quantitative agreement with other sampling methods, while being far more computationally efficient and without any bias. Therefore, combining an unbiased sampling technique such as REST2 with vigorous forcefield testing could be suggested as an important step in developing an efficient and robust strategy for simulating IDPs.

  13. Highly sensitive oligothiophene-phenylamine-based dual-functional fluorescence "turn-on" sensor for rapid and simultaneous detection of Al3+ and Fe3+ in environment and food samples.

    PubMed

    Guo, Zongrang; Niu, Qingfen; Li, Tianduo

    2018-07-05

    Developing low-cost and efficient sensors for rapid, selective and sensitive detection of transition metal ions in environmental and food science is very important. In this study, a novel dual-functional fluorescent "turn-on" sensor 3TP based on an oligothiophene-phenylamine Schiff base has been synthesized for the discrimination and simultaneous detection of both Al3+ and Fe3+ ions with high selectivity and anti-interference over other metal ions. Sensor 3TP displayed a very fast fluorescence-enhanced response towards Al3+ and Fe3+ ions with low detection limits (0.177 μM for Al3+ and 0.172 μM for Fe3+) and a wide pH response range (4.0-12.0). The Al3+/Fe3+ sensing mechanisms were investigated by fluorescence experiments, 1H NMR titrations, FT-IR and ESI-MS spectra. Importantly, sensor 3TP served as an efficient solid material for the highly sensitive and selective detection of Fe3+ on TLC plates. Moreover, sensor 3TP has been successfully used to detect trace Al3+ and Fe3+ in environmental and food samples with satisfactory results and good recoveries, revealing a convenient, reliable and accurate method for Al3+ and Fe3+ analysis in real samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Highly sensitive oligothiophene-phenylamine-based dual-functional fluorescence "turn-on" sensor for rapid and simultaneous detection of Al3+ and Fe3+ in environment and food samples

    NASA Astrophysics Data System (ADS)

    Guo, Zongrang; Niu, Qingfen; Li, Tianduo

    2018-07-01

    Developing low-cost and efficient sensors for rapid, selective and sensitive detection of transition metal ions in environmental and food science is very important. In this study, a novel dual-functional fluorescent "turn-on" sensor 3TP based on an oligothiophene-phenylamine Schiff base has been synthesized for the discrimination and simultaneous detection of both Al3+ and Fe3+ ions with high selectivity and anti-interference over other metal ions. Sensor 3TP displayed a very fast fluorescence-enhanced response towards Al3+ and Fe3+ ions with low detection limits (0.177 μM for Al3+ and 0.172 μM for Fe3+) and a wide pH response range (4.0-12.0). The Al3+/Fe3+ sensing mechanisms were investigated by fluorescence experiments, 1H NMR titrations, FT-IR and ESI-MS spectra. Importantly, sensor 3TP served as an efficient solid material for the highly sensitive and selective detection of Fe3+ on TLC plates. Moreover, sensor 3TP has been successfully used to detect trace Al3+ and Fe3+ in environmental and food samples with satisfactory results and good recoveries, revealing a convenient, reliable and accurate method for Al3+ and Fe3+ analysis in real samples.

  15. General method for rapid purification of native chromatin fragments.

    PubMed

    Kuznetsov, Vyacheslav I; Haws, Spencer A; Fox, Catherine A; Denu, John M

    2018-05-24

    Biochemical, proteomic and epigenetic studies of chromatin rely on the ability to efficiently isolate native nucleosomes in high yield and purity. However, isolation of native chromatin suitable for many downstream experiments remains a challenging task. This is especially true for the budding yeast Saccharomyces cerevisiae, which continues to serve as an important model organism for the study of chromatin structure and function. Here, we developed a time- and cost-efficient universal protocol for the isolation of native chromatin fragments from yeast, insect, and mammalian cells. The resulting protocol preserves histone posttranslational modifications in the native chromatin state and is applicable to both parallel multi-sample spin-column purification and large-scale isolation. This protocol is based on the efficient and stable purification of polynucleosomes and features a combination of optimized cell lysis and purification conditions, three options for chromatin fragmentation, and a novel ion-exchange chromatographic purification strategy. The procedure will aid chromatin researchers interested in isolating native chromatin material for biochemical studies and will serve as a mild, acid- and detergent-free sample preparation method for mass-spectrometry analysis. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.

  16. An efficient gridding reconstruction method for multishot non-Cartesian imaging with correction of off-resonance artifacts.

    PubMed

    Meng, Yuguang; Lei, Hao

    2010-06-01

    An efficient iterative gridding reconstruction method with correction of off-resonance artifacts was developed, which is especially tailored for multiple-shot non-Cartesian imaging. The novelty of the method lies in that the transformation matrix for gridding (T) was constructed as the convolution of two sparse matrices, of which the former is determined by the sampling interval and the spatial distribution of the off-resonance frequencies, and the latter by the sampling trajectory and the target grid in the Cartesian space. The resulting T matrix is also sparse and can be solved efficiently with the iterative conjugate gradient algorithm. It was shown that, with the proposed method, the reconstruction speed in multiple-shot non-Cartesian imaging can be improved significantly while retaining high reconstruction fidelity. More importantly, the proposed method allows a trade-off between the accuracy and the computation time of reconstruction, making it possible to customize the method for different applications. The performance of the proposed method was demonstrated by numerical simulation and multiple-shot spiral imaging on rat brain at 4.7 T. (c) 2010 Wiley-Liss, Inc.
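
    The computational core described here, iteratively solving a sparse linear system, can be sketched as follows: a random sparse matrix stands in for the gridding matrix T, and the normal equations are solved with conjugate gradients. All sizes and densities below are hypothetical.

        import numpy as np
        from scipy.sparse import random as sparse_random
        from scipy.sparse.linalg import LinearOperator, cg

        rng = np.random.default_rng(3)
        n_samples, n_grid = 4000, 2500   # non-Cartesian samples -> Cartesian grid points

        # Stand-in for the sparse matrix T (in the paper, a convolution of two
        # sparse matrices); here its entries are random.
        T = sparse_random(n_samples, n_grid, density=0.01, random_state=3, format="csr")
        x_true = rng.normal(size=n_grid)
        y = T @ x_true                   # simulated acquired (non-Cartesian) data

        # Solve the normal equations (T^H T) x = T^H y with conjugate gradients.
        normal = LinearOperator((n_grid, n_grid), matvec=lambda v: T.T @ (T @ v))
        x_hat, info = cg(normal, T.T @ y, maxiter=500)
        print(info, np.linalg.norm(T @ x_hat - y) / np.linalg.norm(y))  # info == 0 if converged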

  17. Collection of Aerosolized Human Cytokines Using Teflon® Filters

    PubMed Central

    McKenzie, Jennifer H.; McDevitt, James J.; Fabian, M. Patricia; Hwang, Grace M.; Milton, Donald K.

    2012-01-01

    Background Collection of exhaled breath samples for the analysis of inflammatory biomarkers is an important area of research aimed at improving our ability to diagnose, treat and understand the mechanisms of chronic pulmonary disease. Current collection methods based on condensation of water vapor from exhaled breath yield biomarker levels at or near the detection limits of immunoassays contributing to problems with reproducibility and validity of biomarker measurements. In this study, we compare the collection efficiency of two aerosol-to-liquid sampling devices to a filter-based collection method for recovery of dilute laboratory generated aerosols of human cytokines so as to identify potential alternatives to exhaled breath condensate collection. Methodology/Principal Findings Two aerosol-to-liquid sampling devices, the SKC® Biosampler and Omni 3000™, as well as Teflon® filters were used to collect aerosols of human cytokines generated using a HEART nebulizer and single-pass aerosol chamber setup in order to compare the collection efficiencies of these sampling methods. Additionally, methods for the use of Teflon® filters to collect and measure cytokines recovered from aerosols were developed and evaluated through use of a high-sensitivity multiplex immunoassay. Our results show successful collection of cytokines from pg/m3 aerosol concentrations using Teflon® filters and measurement of cytokine levels in the sub-picogram/mL concentration range using a multiplex immunoassay with sampling times less than 30 minutes. Significant degradation of cytokines was observed due to storage of cytokines in concentrated filter extract solutions as compared to storage of dry filters. Conclusions Use of filter collection methods resulted in significantly higher efficiency of collection than the two aerosol-to-liquid samplers evaluated in our study. The results of this study provide the foundation for a potential new technique to evaluate biomarkers of inflammation in exhaled breath samples. PMID:22574123

  18. Extreme ultraviolet reflection efficiencies of diamond-turned aluminum, polished nickel, and evaporated gold surfaces. [for telescope mirrors

    NASA Technical Reports Server (NTRS)

    Malina, R. F.; Cash, W.

    1978-01-01

    Measured reflection efficiencies are presented for flat samples of diamond-turned aluminum, polished nickel, and evaporated gold surfaces fabricated by techniques suited for EUV telescopes. The aluminum samples were 6.2-cm-diameter disks of 6061-T6; the electroless nickel samples were formed by plating beryllium disks with 7.5 μm of Kanigen. Gold samples were produced by coating the aluminum and nickel samples with 5 strips of evaporated gold. Reflection efficiencies are given for grazing angles in the 5-75 degree range. The results indicate that for wavelengths over about 100 Å, the gold-coated nickel samples yield the highest efficiencies; for shorter wavelengths, the nickel samples yield better efficiencies. 500 Å is found to be the optimal gold thickness.

  19. The application of DEA (Data Envelopment Analysis) window analysis in the assessment of influence on operational efficiencies after the establishment of branched hospitals.

    PubMed

    Jia, Tongying; Yuan, Huiyun

    2017-04-12

    Many large-scale public hospitals in China have established branched hospitals. This study provides evidence for strategy making on the management and development of multi-branched hospitals by evaluating and comparing the operational efficiencies of different hospitals before and after their establishment of branched hospitals. DEA (Data Envelopment Analysis) window analysis was performed on a 7-year data pool from five public hospitals, provided by health authorities and institutional surveys. The operational efficiencies of the sample hospitals measured in this study (including technical efficiency, pure technical efficiency, and scale efficiency) showed an overall increasing trend during this 7-year period; however, a temporary downturn occurred shortly after the establishment of branched hospitals. Pure technical efficiency contributed more to the improvement of technical efficiency than scale efficiency did. The establishment of branched hospitals did not lead to a long-term negative effect on hospital operational efficiencies. Our data indicate the importance of improving scale efficiency via the optimization of organizational management, as well as the advantage of a different form of branch establishment, merging and reorganization. This study provides insight into the practical application of DEA window analysis for assessing hospital operational efficiencies.
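
    The input-oriented CCR envelopment model underlying DEA can be written as a small linear program per hospital (decision-making unit, DMU). A sketch with hypothetical input/output data, not the study's 7-year panel or its window-analysis bookkeeping:

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical data: rows = DMUs (hospitals); two inputs, two outputs.
        X = np.array([[20., 300.], [30., 200.], [40., 100.], [20., 200.], [10., 400.]])
        Y = np.array([[100., 90.], [80., 60.], [60., 20.], [90., 50.], [70., 100.]])
        n, m, r = X.shape[0], X.shape[1], Y.shape[1]

        def ccr_efficiency(j):
            """min theta s.t. X^T lam <= theta * x_j, Y^T lam >= y_j, lam >= 0."""
            c = np.concatenate(([1.0], np.zeros(n)))     # variables: [theta, lam_1..lam_n]
            A_in = np.hstack((-X[j][:, None], X.T))      # X^T lam - theta * x_j <= 0
            A_out = np.hstack((np.zeros((r, 1)), -Y.T))  # -Y^T lam <= -y_j
            res = linprog(c, A_ub=np.vstack((A_in, A_out)),
                          b_ub=np.concatenate((np.zeros(m), -Y[j])),
                          bounds=[(0, None)] * (1 + n))
            return res.x[0]

        for j in range(n):
            print(f"DMU {j}: technical efficiency = {ccr_efficiency(j):.3f}")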

  20. The role of pre-morbid intelligence and cognitive reserve in predicting cognitive efficiency in a sample of Italian elderly.

    PubMed

    Caffò, Alessandro O; Lopez, Antonella; Spano, Giuseppina; Saracino, Giuseppe; Stasolla, Fabrizio; Ciriello, Giuseppe; Grattagliano, Ignazio; Lancioni, Giulio E; Bosco, Andrea

    2016-12-01

    Models of cognitive reserve in aging suggest that an individual's life experience (education, working activity, and leisure) can exert a neuroprotective effect against cognitive decline and may represent an important contribution to successful aging. The objective of the present study is to investigate the role of cognitive reserve, pre-morbid intelligence, age, and education level in predicting cognitive efficiency in a sample of healthy aged individuals and individuals with probable mild cognitive impairment. Two hundred and eight aging participants recruited from the provincial region of Bari (Apulia, Italy) took part in the study. A battery of standardized tests was administered to measure cognitive reserve, pre-morbid intelligence, and cognitive efficiency. Protocols for 10 participants were excluded because they did not meet the inclusion criteria, and statistical analyses were conducted on data from the remaining 198 participants. A path analysis was used to test the following model: age, education level, and intelligence directly influence cognitive reserve and cognitive efficiency; cognitive reserve mediates the influence of age, education level, and intelligence on cognitive efficiency. Cognitive reserve fully mediates the relationship between pre-morbid intelligence and education level and cognitive efficiency, while age maintains a direct effect on cognitive efficiency. Cognitive reserve appears to exert a protective effect against cognitive decline in normal and pathological populations, thus masking, at least in the early phases of neurodegeneration, the decline of memory, orientation, attention, language, and reasoning skills. The assessment of cognitive reserve may represent a useful supplement in neuropsychological screening protocols for cognitive decline.

  1. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004
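
    The role of accurate probability weights highlighted here can be demonstrated in a few lines: when inclusion probabilities correlate with the quantity of interest, the unweighted sample mean is biased, and inverse-probability weighting corrects it. A sketch with synthetic data:

        import numpy as np

        rng = np.random.default_rng(4)
        revenue = rng.lognormal(mean=10.0, sigma=1.0, size=10_000)  # synthetic MSE census

        # Unequal selection: pretend larger, more visible firms are three times
        # as likely to be sampled (a stand-in for non-random selection effects).
        pi = np.where(revenue > np.median(revenue), 0.03, 0.01)     # inclusion probabilities
        selected = rng.random(revenue.size) < pi
        sample, weights = revenue[selected], 1.0 / pi[selected]

        naive = sample.mean()                                  # ignores unequal probabilities
        weighted = np.sum(weights * sample) / np.sum(weights)  # Hajek-type estimator
        print(f"census {revenue.mean():.0f}  naive {naive:.0f}  weighted {weighted:.0f}")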

  2. A ground-based method of assessing urban forest structure and ecosystem services

    Treesearch

    David J. Nowak; Daniel E. Crane; Jack C. Stevens; Robert E. Hoehn; Jeffrey T. Walton; Jerry Bond

    2008-01-01

    To properly manage urban forests, it is essential to have data on this important resource. An efficient means to obtain this information is to randomly sample urban areas. To help assess the urban forest structure (e.g., number of trees, species composition, tree sizes, health) and several functions (e.g., air pollution removal, carbon storage and sequestration), the...

  3. Bacterial communities in commercial aircraft high-efficiency particulate air (HEPA) filters assessed by PhyloChip analysis.

    PubMed

    Korves, T M; Piceno, Y M; Tom, L M; Desantis, T Z; Jones, B W; Andersen, G L; Hwang, G M

    2013-02-01

    Air travel can rapidly transport infectious diseases globally. To facilitate the design of biosensors for infectious organisms in commercial aircraft, we characterized bacterial diversity in aircraft air. Samples from 61 aircraft high-efficiency particulate air (HEPA) filters were analyzed with a custom microarray of 16S rRNA gene sequences (PhyloChip), representing bacterial lineages. A total of 606 subfamilies from 41 phyla were detected. The most abundant bacterial subfamilies included bacteria associated with humans, especially skin, gastrointestinal and respiratory tracts, and with water and soil habitats. Operational taxonomic units that contain important human pathogens as well as their close, more benign relatives were detected. When compared to 43 samples of urban outdoor air, aircraft samples differed in composition, with higher relative abundance of Firmicutes and Gammaproteobacteria lineages in aircraft samples, and higher relative abundance of Actinobacteria and Betaproteobacteria lineages in outdoor air samples. In addition, aircraft and outdoor air samples differed in the incidence of taxa containing human pathogens. Overall, these results demonstrate that HEPA filter samples can be used to deeply characterize bacterial diversity in aircraft air and suggest that the presence of close relatives of certain pathogens must be taken into account in probe design for aircraft biosensors. A biosensor that could be deployed in commercial aircraft would be required to function at an extremely low false alarm rate, making an understanding of microbial background important. This study reveals a diverse bacterial background present on aircraft, including bacteria closely related to pathogens of public health concern. Furthermore, this aircraft background is different from outdoor air, suggesting different probes may be needed to detect airborne contaminants to achieve minimal false alarm rates. This study also indicates that aircraft HEPA filters could be used with other molecular techniques to further characterize background bacteria and in investigations in the wake of a disease outbreak. © 2012 John Wiley & Sons A/S.

  4. Uncertainty importance analysis using parametric moment ratio functions.

    PubMed

    Wei, Pengfei; Lu, Zhenzhou; Song, Jingwen

    2014-02-01

    This article presents a new importance analysis framework, called the parametric moment ratio function, for measuring the reduction of model output uncertainty when the distribution parameters of the inputs are changed; the emphasis is on the mean and variance ratio functions with respect to the variances of the model inputs. The proposed concepts efficiently guide the analyst to achieve a targeted reduction of the model output mean and variance by operating on the variances of the model inputs. The unbiased and progressive unbiased Monte Carlo estimators are also derived for the parametric mean and variance ratio functions, respectively. Only a single set of samples is needed to implement the proposed importance analysis with these estimators, so the computational cost is independent of the input dimensionality. An analytical test example with highly nonlinear behavior is introduced to illustrate the engineering significance of the proposed importance analysis technique and to verify the efficiency and convergence of the derived Monte Carlo estimators. Finally, the moment ratio function is applied to a planar 10-bar structure to achieve a targeted 50% reduction of the model output variance. © 2013 Society for Risk Analysis.
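
    The notion of a variance ratio function, output variance as a function of an input's variance, can be approximated by brute-force Monte Carlo, as sketched below on a toy model. Note that this re-samples for every parameter value; the article's contribution is estimators that reuse a single sample set.

        import numpy as np

        rng = np.random.default_rng(5)
        g = lambda x1, x2: x1**2 + x1 * x2 + x2   # hypothetical model with interaction

        def output_variance(scale, n=200_000):
            """Var of g(X1, X2) when Var(X1) is multiplied by `scale` (plain MC)."""
            x1 = rng.normal(0.0, np.sqrt(scale), n)
            x2 = rng.normal(0.0, 1.0, n)
            return g(x1, x2).var()

        base = output_variance(1.0)
        for scale in (0.25, 0.5, 1.0):
            print(f"Var(X1) x {scale}: output variance ratio = "
                  f"{output_variance(scale) / base:.3f}")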

  5. Beating the curse of dimension with accurate statistics for the Fokker-Planck equation in complex turbulent systems.

    PubMed

    Chen, Nan; Majda, Andrew J

    2017-12-05

    Solving the Fokker-Planck equation for high-dimensional complex dynamical systems is an important issue. Recently, the authors developed efficient statistically accurate algorithms for solving the Fokker-Planck equations associated with high-dimensional nonlinear turbulent dynamical systems with conditional Gaussian structures, which contain many strong non-Gaussian features such as intermittency and fat-tailed probability density functions (PDFs). The algorithms involve a hybrid strategy with a small number of samples [Formula: see text], where a conditional Gaussian mixture in a high-dimensional subspace via an extremely efficient parametric method is combined with a judicious Gaussian kernel density estimation in the remaining low-dimensional subspace. In this article, two effective strategies are developed and incorporated into these algorithms. The first strategy involves a judicious block decomposition of the conditional covariance matrix such that the evolutions of different blocks have no interactions, which allows an extremely efficient parallel computation due to the small size of each individual block. The second strategy exploits statistical symmetry for a further reduction of [Formula: see text]. The resulting algorithms can efficiently solve the Fokker-Planck equation with strongly non-Gaussian PDFs in much higher dimensions, even with orders in the millions, and thus beat the curse of dimension. The algorithms are applied to a [Formula: see text]-dimensional stochastic coupled FitzHugh-Nagumo model for excitable media. An accurate recovery of both the transient and equilibrium non-Gaussian PDFs requires only [Formula: see text] samples! In addition, the block decomposition enables the algorithms to efficiently capture the distinct non-Gaussian features at different locations in a [Formula: see text]-dimensional two-layer inhomogeneous Lorenz 96 model, using only [Formula: see text] samples. Copyright © 2017 the Author(s). Published by PNAS.
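
    The hybrid estimate described, parametric conditional Gaussians in one subspace crossed with a kernel density estimate in the other, can be sketched on a toy two-dimensional problem where the conditional structure is known exactly. All model parameters below are hypothetical.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(6)
        L = 500                            # small sample size, as in the hybrid strategy
        y = rng.normal(0.0, 1.0, L)        # samples in the low-dimensional subspace

        # Toy conditional-Gaussian structure: x | y ~ N(0.8*y, 0.6^2), known in closed form.
        cond_mean, cond_sd = 0.8 * y, 0.6
        h = 1.06 * y.std() * L ** (-0.2)   # Silverman bandwidth for the KDE factor

        def hybrid_pdf(xq, yq):
            """p(x, y) ~ (1/L) sum_i N(x; mu_i, sd) * K_h(y - y_i)."""
            return np.mean(norm.pdf(xq, cond_mean, cond_sd) * norm.pdf(yq, y, h))

        exact = norm.pdf(0.5, 0.8 * 0.3, 0.6) * norm.pdf(0.3)   # true joint density
        print(f"hybrid {hybrid_pdf(0.5, 0.3):.4f} vs exact {exact:.4f}")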

  6. Determination of azoxystrobin and chlorothalonil using a methacrylate-based polymer modified with gold nanoparticles as solid-phase extraction sorbent.

    PubMed

    Catalá-Icardo, Mónica; Gómez-Benito, Carmen; Simó-Alfonso, Ernesto Francisco; Herrero-Martínez, José Manuel

    2017-01-01

    This paper describes a novel and sensitive method for the extraction, preconcentration, and determination of two important and widely used fungicides, azoxystrobin and chlorothalonil. The developed methodology is based on solid-phase extraction (SPE) using a polymeric material functionalized with gold nanoparticles (AuNPs) as sorbent, followed by high-performance liquid chromatography (HPLC) with diode array detection (DAD). Several experimental variables that affect the extraction efficiency, such as the eluent volume, sample flow rate, and salt addition, were optimized. Under the optimal conditions, the sorbent provided satisfactory enrichment efficiency for both fungicides, high selectivity, and excellent reusability (>120 re-uses). The proposed method allowed the detection of 0.05 μg/L of the fungicides and gave satisfactory recoveries (75-95%) when applied to drinking and environmental water samples (river, well, tap, irrigation, spring, and sea waters).

  7. Binomial leap methods for simulating stochastic chemical kinetics.

    PubMed

    Tian, Tianhai; Burrage, Kevin

    2004-12-01

    This paper discusses efficient simulation methods for stochastic chemical kinetics. Based on the tau-leap and midpoint tau-leap methods of Gillespie [D. T. Gillespie, J. Chem. Phys. 115, 1716 (2001)], binomial random variables are used in these leap methods rather than Poisson random variables. The motivation for this approach is to improve the efficiency of the Poisson leap methods by using larger stepsizes. Unlike Poisson random variables, whose range of sample values is from zero to infinity, binomial random variables have a finite range of sample values. This probabilistic property has been used to restrict possible reaction numbers and to avoid negative molecular numbers in stochastic simulations when larger stepsizes are used. In this approach a binomial random variable is defined for a single reaction channel in order to keep the reaction number of this channel below the number of molecules that undergo this reaction channel. A sampling technique is also designed for the total reaction number of a reactant species that undergoes two or more reaction channels. Samples for the total reaction number are not greater than the molecular number of this species. In addition, probability properties of the binomial random variables provide stepsize conditions for restricting reaction numbers in a chosen time interval. These stepsize conditions are important properties of robust leap control strategies. Numerical results indicate that the proposed binomial leap methods can be applied to a wide range of chemical reaction systems with very good accuracy and a significant improvement in efficiency over existing approaches. (c) 2004 American Institute of Physics.
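
    The defining property of binomial leaping, reaction counts bounded by the available molecules, is easy to show for a first-order conversion A -> B. The rate constant and step size below are hypothetical; a Poisson draw with the same mean could exceed the current population, whereas the binomial draw cannot.

        import numpy as np

        rng = np.random.default_rng(7)
        c, tau = 0.3, 0.5        # hypothetical rate constant and (large) leap size
        n_a, n_b, t = 1000, 0, 0.0

        while n_a > 0 and t < 20.0:
            # Binomial leap: K ~ Binomial(n_a, p) is bounded by n_a by construction,
            # unlike a Poisson draw with mean c * n_a * tau.
            p = min(1.0, c * tau)    # per-molecule firing probability in the leap
            k = rng.binomial(n_a, p)
            n_a, n_b, t = n_a - k, n_b + k, t + tau

        print(n_a, n_b)          # n_a + n_b == 1000 always; populations never go negative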

  8. Investigation of solvent-free MALDI-TOFMS sample preparation methods for the analysis of organometallic and coordination compounds.

    PubMed

    Hughes, Laura; Wyatt, Mark F; Stein, Bridget K; Brenton, A Gareth

    2009-01-15

    An investigation of various solvent-free matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOFMS) sample preparation methods for the characterization of organometallic and coordination compounds is described. Such methods are desirable for insoluble materials, compounds that are only soluble in disadvantageous solvents, or complexes that dissociate in solution, all of which present a major "difficulty" to most mass spectrometry techniques. First-row transition metal acetylacetonate complexes, which have been characterized previously by solution preparation MALDI-TOFMS, were used to evaluate the various solvent-free procedures. These procedures comprise two distinct steps: the first being the efficient "solids mixing" (the mixing of sample and matrix), and the second being the effective transfer of the sample/matrix mixture to the MALDI target plate. This investigation shows that vortex mixing is the most efficient first step and that smearing using a microspatula is the most effective second step. In addition, the second step is shown to be much more critical than the first step in obtaining high-quality data. Case studies of truly insoluble materials highlight the importance of these techniques for the wider chemistry community.

  9. Novel Method for High-Throughput Full-Length IGHV-D-J Sequencing of the Immune Repertoire from Bulk B-Cells with Single-Cell Resolution.

    PubMed

    Vergani, Stefano; Korsunsky, Ilya; Mazzarello, Andrea Nicola; Ferrer, Gerardo; Chiorazzi, Nicholas; Bagnara, Davide

    2017-01-01

    Efficient and accurate high-throughput DNA sequencing of the adaptive immune receptor repertoire (AIRR) is necessary to study immune diversity in healthy subjects and disease-related conditions. The high complexity and diversity of the AIRR, coupled with the limited amount of starting material, make such sequencing particularly challenging and can compromise identification of the full biological diversity. AIRR sequencing protocols often fail to fully capture the sampled AIRR diversity, especially for samples containing restricted numbers of B lymphocytes. Here, we describe a library preparation method for immunoglobulin sequencing that results in an exhaustive full-length repertoire where virtually every sampled B-cell is sequenced. This maximizes the likelihood of identifying and quantifying the entire IGHV-D-J repertoire of a sample, including the detection of rearrangements present in only one cell in the starting population. The methodology establishes the importance of circumventing genetic material dilution in the preamplification phases and incorporates several key concepts: (1) balancing the amount of starting material and the depth of sequencing, (2) avoiding IGHV gene-specific amplification, and (3) using Unique Molecular Identifiers. Together, these make the methodology highly efficient, in particular for detecting rare rearrangements in the sampled population and when only a limited amount of starting material is available.

  10. Unified Model for the Overall Efficiency of Inlets Sampling from Horizontal Aerosol Flows

    NASA Astrophysics Data System (ADS)

    Hangal, Sunil Pralhad

    When sampling aerosols from ambient or industrial air environments, the sampled aerosol must be representative of the aerosol in the free stream. The changes that occur during sampling must be assessed quantitatively so that sampling errors can be compensated for. In this study, unified models have been developed for the overall efficiency of tubular sharp-edged inlets sampling from horizontal aerosol flows oriented at 0° to 90° relative to the wind direction in the vertical (pitch) and horizontal (yaw) planes. In the unified model, based on experimental data, the aspiration efficiency is represented by a single equation with different inertial parameters at 0° to 60° and 45° to 90°. The transmission efficiency is separated into two components: one due to gravitational settling in the boundary layer and the other due to impaction. The gravitational settling component is determined by extending a previously developed isoaxial sampling model to nonisoaxial sampling. The impaction component is determined by a new model that quantifies the particle losses caused by wall impaction. The model also quantifies the additional particle losses resulting from turbulent motion in the vena contracta, which forms in the inlet when the inlet velocity is higher than the wind velocity. When sampling aerosols in ambient or industrial environments with an inlet, small changes in wind direction or physical constraints on positioning the inlet in the system necessitate assessing the sampling efficiency in both the vertical and horizontal planes. The overall sampling efficiency of tubular inlets has been experimentally investigated in yaw and pitch orientations at 0° to 20° from horizontal aerosol flows using a wind tunnel facility. The model for overall sampling efficiency has been extended to include both yaw and pitch sampling based on the new data. In this model, the difference between yaw and pitch is expressed by the effect of gravity on the impaction process inside the inlet, described by a newly developed gravity-effect angle. At yaw, the gravity-effect angle on the wall impaction process does not change with sampling angle. At pitch, the gravity effect on the impaction process increases particle losses for upward sampling and decreases them for downward sampling. Using the unified model, graphical representations have been developed for sampling at small angles. These can be used in the field to determine the overall sampling efficiency of inlets at various operating conditions and to identify operating conditions that result in an acceptable sampling error. Pitch and diameter factors have been introduced for relating efficiency values over a wide range of conditions to those of a reference condition. The pitch factor determines the overall sampling efficiency at pitch from yaw values, and the diameter factor determines the overall sampling efficiency at different inlet diameters.

  11. Simultaneous analysis of 70 pesticides using HPLC/MS/MS: a comparison of the multiresidue method of Klein and Alder and the QuEChERS method.

    PubMed

    Riedel, Melanie; Speer, Karl; Stuke, Sven; Schmeer, Karl

    2010-01-01

    Since 2003, two new multipesticide residue methods for screening crops for a large number of pesticides, developed by Klein and Alder and by Anastassiades et al. (Quick, Easy, Cheap, Effective, Rugged, and Safe; QuEChERS), have been published. Our intention was to compare these two important methods on the basis of their extraction efficiency, reproducibility, ruggedness, ease of use, and speed. In total, 70 pesticides belonging to numerous different substance classes were analyzed at two concentration levels by applying both methods to five different representative matrixes. In the case of the QuEChERS method, the results of the three sample preparation steps (crude extract, extract after SPE, and extract after SPE and acidification) were compared with each other and with the results obtained with the Klein and Alder method. The extraction efficiencies of the QuEChERS method were far higher, and the sample preparation was much quicker when the last two steps were omitted; in most cases, the extraction efficiencies after the first step were approximately 100%. With extraction efficiencies of mostly less than 70%, the Klein and Alder method did not compare favorably. Some analytes caused problems during evaluation, mostly due to matrix influences.

  12. Changes in the Mg profile and in dislocations induced by high temperature annealing of blue LEDs

    NASA Astrophysics Data System (ADS)

    Meneghini, M.; Trivellin, N.; Berti, M.; Cesca, T.; Gasparotto, A.; Vinattieri, A.; Bogani, F.; Zhu, D.; Humphreys, C. J.; Meneghesso, G.; Zanoni, E.

    2013-03-01

    The efficiency of the injection and recombination processes in InGaN/GaN LEDs is governed by the properties of the active region of the devices, which strongly depend on the conditions used for the growth of the epitaxial material. To improve device quality, it is very important to understand how the high temperatures used during the growth process can modify the quality of the epitaxial material. With this paper we present a study of the modifications in the properties of InGaN/GaN LED structures induced by high temperature annealing: thermal stress tests were carried out at 900 °C, in nitrogen atmosphere, on selected samples. The efficiency and the recombination dynamics were evaluated by photoluminescence measurements (both integrated and time-resolved), while the properties of the epitaxial material were studied by Secondary Ion Mass Spectroscopy (SIMS) and Rutherford Backscattering (RBS) channeling measurements. Results indicate that exposure to high temperatures may lead to: (i) a significant increase in the photoluminescence efficiency of the devices; (ii) a decrease in the parasitic emission bands located between 380 nm and 400 nm; (iii) an increase in carrier lifetime, as detected by time-resolved photoluminescence measurements. The increase in device efficiency is tentatively ascribed to an improvement in the crystallographic quality of the samples.

  13. Structure optimisation by thermal cycling for the hydrophobic-polar lattice model of protein folding

    NASA Astrophysics Data System (ADS)

    Günther, Florian; Möbius, Arnulf; Schreiber, Michael

    2017-03-01

    The function of a protein depends strongly on its spatial structure. Therefore the transition from an unfolded stage to the functional fold is one of the most important problems in computational molecular biology. Since the corresponding free energy landscapes exhibit huge numbers of local minima, the search for the lowest-energy configurations is very demanding, and efficient heuristic algorithms are of high value. In the present work, we investigate whether and how the thermal cycling (TC) approach can be applied to the hydrophobic-polar (HP) lattice model of protein folding. Evaluating the efficiency of TC for a set of two- and three-dimensional examples, we compare the performance of this strategy with that of multi-start local search (MSLS) procedures and that of simulated annealing (SA). To this end, we incorporated several simple but rather efficient modifications into the standard procedures; in particular, a strong improvement was achieved by also allowing energy-conserving state modifications. Furthermore, the consideration of ensembles instead of single samples was found to greatly improve the efficiency of TC. Across various benchmarks, for all considered HP sequences, we found TC to be far superior to SA and faster than Wang-Landau sampling.
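
    For reference, the HP lattice model energy that these heuristics minimize assigns -1 to every pair of hydrophobic (H) monomers that are lattice neighbours but not neighbours along the chain. A minimal sketch with a hypothetical 8-mer:

        def hp_energy(sequence, path):
            """sequence: string of 'H'/'P'; path: list of (x, y) lattice sites."""
            pos = {p: i for i, p in enumerate(path)}
            energy = 0
            for i, (x, y) in enumerate(path):
                if sequence[i] != 'H':
                    continue
                for nb in ((x + 1, y), (x, y + 1)):   # each pair counted once
                    j = pos.get(nb)
                    if j is not None and sequence[j] == 'H' and abs(i - j) > 1:
                        energy -= 1
            return energy

        # Hypothetical 8-mer folded into a 2 x 4 rectangle.
        seq = "HPHPPHPH"
        fold = [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (2, 1), (1, 1), (0, 1)]
        print(hp_energy(seq, fold))   # -2: two non-bonded H-H contacts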

  14. Uncertainty quantification of seabed parameters for large data volumes along survey tracks with a tempered particle filter

    NASA Astrophysics Data System (ADS)

    Dettmer, J.; Quijano, J. E.; Dosso, S. E.; Holland, C. W.; Mandolesi, E.

    2016-12-01

    Geophysical seabed properties are important for the detection and classification of unexploded ordnance. However, current surveying methods such as vertical seismic profiling, coring, or inversion are of limited use when surveying large areas with high spatial sampling density. We consider surveys based on a source and receiver array towed by an autonomous vehicle, which produce large volumes of seabed reflectivity data that contain unprecedented and detailed seabed information. The data are analyzed with a particle filter, which requires efficient reflection-coefficient computation, efficient inversion algorithms, and efficient use of computer resources. The filter quantifies the information content of multiple sequential data sets by considering results from previous data along the survey track to inform the importance sampling at the current point. Challenges arise from environmental changes along the track, where the number of sediment layers and their properties change. This is addressed by a trans-dimensional model in the filter, which allows layering complexity to change along a track. Efficiency is improved by likelihood tempering of various particle subsets and by including exchange moves (parallel tempering). The filter is implemented on a hybrid computer that combines central processing units (CPUs) and graphics processing units (GPUs) to exploit three levels of parallelism: (1) fine-grained parallel computation of spherical reflection coefficients with a GPU implementation of Levin integration; (2) updating particles by concurrent CPU processes which exchange information using automatic load balancing (coarse-grained parallelism); (3) overlapping CPU-GPU communication (a major bottleneck) with GPU computation by staggering CPU access to the multiple GPUs. The algorithm is applied to spherical reflection coefficients for data sets along a 14-km track on the Malta Plateau, Mediterranean Sea. We demonstrate substantial efficiency gains over previous methods. [This research was supported in part by the U.S. Dept. of Defense, through the Strategic Environmental Research and Development Program (SERDP).]
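
    The parallel tempering (exchange move) ingredient mentioned here can be sketched in a few lines: chains run at different inverse temperatures, and neighbouring chains swap states with the Metropolis exchange probability. This toy version uses a one-dimensional double well, not the seabed reflection-coefficient likelihood.

        import numpy as np

        rng = np.random.default_rng(8)
        E = lambda x: (x**2 - 1.0)**2             # toy negative log-likelihood
        betas = np.array([1.0, 0.5, 0.25, 0.1])   # tempering ladder (1.0 = target)
        x = np.zeros(len(betas))                  # one chain state per temperature

        for _ in range(5000):
            # Within-chain Metropolis updates at each temperature.
            prop = x + rng.normal(0.0, 0.5, len(betas))
            accept = rng.random(len(betas)) < np.exp(-betas * (E(prop) - E(x)))
            x = np.where(accept, prop, x)
            # Exchange move between a random pair of neighbouring temperatures.
            i = rng.integers(0, len(betas) - 1)
            if rng.random() < np.exp((betas[i] - betas[i + 1]) * (E(x[i]) - E(x[i + 1]))):
                x[i], x[i + 1] = x[i + 1], x[i]

        print(round(float(x[0]), 3))              # state of the target-temperature chain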

  15. Maximizing fluorescence collection efficiency in multiphoton microscopy

    PubMed Central

    Zinter, Joseph P.; Levene, Michael J.

    2011-01-01

    Understanding fluorescence propagation through a multiphoton microscope is of critical importance in designing high performance systems capable of deep tissue imaging. Optical models of a scattering tissue sample and the Olympus 20X 0.95NA microscope objective were used to simulate fluorescence propagation as a function of imaging depth for physiologically relevant scattering parameters. The spatio-angular distribution of fluorescence at the objective back aperture derived from these simulations was used to design a simple, maximally efficient post-objective fluorescence collection system. Monte Carlo simulations corroborated by data from experimental tissue phantoms demonstrate collection efficiency improvements of 50% – 90% over conventional, non-optimized fluorescence collection geometries at large imaging depths. Imaging performance was verified by imaging layer V neurons in mouse cortex to a depth of 850 μm. PMID:21934897

  16. Experimental method for testing diffraction properties of reflection waveguide holograms.

    PubMed

    Xie, Yi; Kang, Ming-Wu; Wang, Bao-Ping

    2014-07-01

    Waveguide holograms' diffraction properties include peak wavelength and diffraction efficiency, which play an important role in determining their display performance. Based on the recording and reconstruction theory of reflection waveguide holograms, a novel experimental method for testing diffraction properties is introduced and analyzed in this paper. The method uses a plano-convex lens optically contacted to the surface of the substrate plate of the waveguide hologram, so that the diffracted light beam can be easily detected. An experiment was then carried out: the test sample was designed for a reconstruction wavelength of 530 nm and a diffraction efficiency of 100%, and the measured results were a peak wavelength of 527.7 nm and a diffraction efficiency of 94.1%. The tested values correspond well with the designed values.

  17. Psychometric properties of the Depression Anxiety and Stress Scale-21 in older primary care patients.

    PubMed

    Gloster, Andrew T; Rhoades, Howard M; Novy, Diane; Klotsche, Jens; Senior, Ashley; Kunik, Mark; Wilson, Nancy; Stanley, Melinda A

    2008-10-01

    The Depression Anxiety Stress Scale (DASS) was designed to efficiently measure the core symptoms of anxiety and depression and has demonstrated positive psychometric properties in adult samples of anxiety and depression patients and student samples. Despite these findings, the psychometric properties of the DASS remain untested in older adults, for whom the identification of efficient measures of these constructs is especially important. To determine the psychometric properties of the DASS 21-item version in older adults, we analyzed data from 222 medical patients seeking treatment to manage worry. Consistent with younger samples, a three-factor structure best fit the data. Results also indicated good internal consistency, excellent convergent validity, and good discriminative validity, especially for the Depression scale. Receiver operating curve analyses indicated that the DASS-21 predicted the diagnostic presence of generalized anxiety disorder and depression as well as other commonly used measures. These data suggest that the DASS may be used with older adults in lieu of multiple scales designed to measure similar constructs, thereby reducing participant burden and facilitating assessment in settings with limited assessment resources.

  18. Sampling, feasibility, and priors in data assimilation

    DOE PAGES

    Tu, Xuemin; Morzfeld, Matthias; Miller, Robert N.; ...

    2016-03-01

    Importance sampling algorithms are discussed in detail, with an emphasis on implicit sampling, and applied to data assimilation via particle filters. Implicit sampling makes it possible to use the data to find high-probability samples at relatively low cost, making the assimilation more efficient. A new analysis of the feasibility of data assimilation is presented, showing in detail why feasibility depends on the Frobenius norm of the covariance matrix of the noise and not on the number of variables. A discussion of the convergence of particular particle filters follows. A major open problem in numerical data assimilation is the determination of appropriate priors; a progress report on recent work on this problem is given. The analysis highlights the need for careful attention to both the data and the physics in data assimilation problems.
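
    The sketch below is not the authors' implicit sampling algorithm, which constructs samples by solving an equation that maps reference samples to the high-probability region; it only illustrates the shared idea of letting the data place the importance proposal, here by centering a Gaussian proposal on the posterior mode of a toy scalar problem. The observation model, prior, and proposal spread are all assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(2)

    def neg_log_post(x, datum=2.0):
        # -log posterior for a toy assimilation problem: prior x ~ N(0,1),
        # observation datum = x + x**3/10 + noise with std 0.5.
        h = x + x ** 3 / 10.0
        return 0.5 * x ** 2 + 0.5 * ((datum - h) / 0.5) ** 2

    # Use the data to find the high-probability region (posterior mode).
    mode = minimize_scalar(neg_log_post).x
    tau = 0.6                               # proposal spread (assumed, not tuned)

    x = rng.normal(mode, tau, 20000)        # proposal centered on the mode
    logw = -neg_log_post(x) + 0.5 * ((x - mode) / tau) ** 2  # up to a constant
    w = np.exp(logw - logw.max())
    w /= w.sum()
    print("posterior mean ~", np.sum(w * x), " ESS ~", 1.0 / np.sum(w ** 2))
    ```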

  19. Fidelity Witnesses for Fermionic Quantum Simulations

    NASA Astrophysics Data System (ADS)

    Gluza, M.; Kliesch, M.; Eisert, J.; Aolita, L.

    2018-05-01

    Experimental interest and developments in quantum spin-1/2 chains have increased steadily over the past decade. In many instances, the target quantum simulation belongs to the broader class of noninteracting fermionic models, constituting an important benchmark. In spite of this class being analytically efficiently tractable, no direct certification tool has yet been reported for it. In fact, in experiments, certification has almost exclusively relied on notions of quantum state tomography that scale very unfavorably with the system size. Here, we develop experimentally friendly fidelity witnesses for all pure fermionic Gaussian target states. Their expectation value yields a tight lower bound to the fidelity and can be measured efficiently. We derive witnesses in full generality in the Majorana-fermion representation and apply them to experimentally relevant spin-1/2 chains. Among others, we show how to efficiently certify strongly out-of-equilibrium dynamics in critical Ising chains. At the heart of the measurement scheme is a variant of importance sampling specially tailored to overlaps between covariance matrices. The method is shown to be robust against finite experimental-state infidelities.

  20. Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna

    USGS Publications Warehouse

    Gunzburger, M.S.

    2007-01-01

    To design effective and efficient research and monitoring programs, researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare amphibian species richness and the number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected were related to the relative lengths of the larval and breeding periods and to tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed, because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.

  1. Effect of mineral constituents in the bioleaching of uranium from uraniferous sedimentary rock samples, Southwestern Sinai, Egypt.

    PubMed

    Amin, Maisa M; Elaassy, Ibrahim E; El-Feky, Mohamed G; Sallam, Abdel Sattar M; Talaat, Mona S; Kawady, Nilly A

    2014-08-01

    Bioleaching uses microorganisms to extract metals from their ore materials; microbial activity has an appreciable effect on the dissolution of toxic metals and radionuclides. Bioleaching of uranium was carried out with fungi isolated from uraniferous sedimentary rocks from Southwestern Sinai, Egypt. Eight fungal species were isolated from different grades of uraniferous samples. The bio-dissolution experiments showed that Aspergillus niger and Aspergillus terreus exhibited the highest uranium leaching efficiencies from the studied samples. Monitoring of the bio-dissolution process showed that the uranium grade and the mineralogic constituents of the ore material play an important role in the bioleaching process. Tests indicated that the optimum conditions for uranium leaching are: 7 days incubation time, 3% pulp density, 30 °C incubation temperature and pH 3. Both fungi produced organic acids, namely oxalic, acetic, citric, formic, malonic, gallic and ascorbic acid, in the culture filtrate, indicating an important role in the bioleaching processes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. End-point diameter and total length coarse woody debris models for the United States

    Treesearch

    C.W. Woodall; J.A. Westfall; D.C. Lutes; S.N. Oswalt

    2008-01-01

    Coarse woody debris (CWD) may be defined as dead and down trees of a certain minimum size that are an important forest ecosystem component (e.g., wildlife habitat, carbon stocks, and fuels). Due to field efficiency concerns, some natural resource inventories only measure the attributes of CWD pieces at their point of intersection with a sampling transect (e.g., transect...

  3. Driven-dissipative quantum Monte Carlo method for open quantum systems

    NASA Astrophysics Data System (ADS)

    Nagy, Alexandra; Savona, Vincenzo

    2018-05-01

    We develop a real-time full configuration-interaction quantum Monte Carlo approach to model driven-dissipative open quantum systems with Markovian system-bath coupling. The method enables stochastic sampling of the Liouville-von Neumann time evolution of the density matrix thanks to a massively parallel algorithm, thus providing estimates of observables on the nonequilibrium steady state. We present the underlying theory and introduce an initiator technique and importance sampling to reduce the statistical error. Finally, we demonstrate the efficiency of our approach by applying it to the driven-dissipative two-dimensional XYZ spin-1/2 model on a lattice.

  4. Advances in the Assessment of Wind Turbine Operating Extreme Loads via More Efficient Calculation Approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, Peter; Damiani, Rick R.; Dykes, Katherine

    2017-01-09

    A new adaptive stratified importance sampling (ASIS) method is proposed as an alternative approach for the calculation of the 50-year extreme load under operational conditions, as in design load case 1.1 of the International Electrotechnical Commission design standard. ASIS combines elements of the binning and extrapolation technique currently described by the standard and of the importance sampling (IS) method to estimate load probabilities of exceedance (POEs). Whereas a Monte Carlo (MC) approach would reach the sought level of POE only with a daunting number of simulations, IS-based techniques are promising because they target the sampling of the input parameters on the parts of the distributions that are most responsible for the extreme loads, thus reducing the number of runs required. We compared the various methods on select load channels output from FAST, an aero-hydro-servo-elastic tool for the design and analysis of wind turbines developed by the National Renewable Energy Laboratory (NREL). Our newly devised method, although still in its infancy in terms of tuning of the subparameters, is comparable to the others in terms of load estimation and its variance versus computational cost, and offers great promise going forward due to the incorporation of adaptivity into the already powerful importance sampling concept.
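
    The importance-sampling ingredient of such methods is easy to show in isolation: to estimate a far-tail probability of exceedance, sample the input from a distribution shifted toward the region that produces extreme loads and reweight by the density ratio. The load model, wind-speed distribution, threshold, and shift below are invented; ASIS additionally stratifies the input space and adapts the proposal, which is not reproduced here.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    def load(v):
        # Toy load channel: linear in wind speed plus response scatter.
        return 5.0 * v + rng.normal(0.0, 5.0, size=np.shape(v))

    t = 120.0                 # extreme-load threshold
    mu, sd = 10.0, 2.5        # operating wind-speed distribution (assumed)
    shift = 8.0               # tilt the proposal toward high winds

    n = 200_000
    v = rng.normal(mu + shift, sd, n)                       # importance proposal
    w = stats.norm.pdf(v, mu, sd) / stats.norm.pdf(v, mu + shift, sd)
    poe_is = np.mean(w * (load(v) > t))

    poe_mc = np.mean(load(rng.normal(mu, sd, n)) > t)       # crude MC, for contrast
    print(f"IS estimate {poe_is:.2e}  vs crude MC {poe_mc:.2e}")
    ```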

  5. Introduction of agarose gel as a green membrane in electromembrane extraction: An efficient procedure for the extraction of basic drugs with a wide range of polarities.

    PubMed

    Tabani, Hadi; Asadi, Sakine; Nojavan, Saeed; Parsa, Mitra

    2017-05-12

    Developing green methods for analyte extraction is one of the most important topics in the field of sample preparation. In this study, for the first time, agarose gel was used as the membrane in electromembrane extraction (EME), without any organic solvent, for the extraction of four model basic drugs (rivastigmine (RIV), verapamil (VER), amlodipine (AML), and morphine (MOR)) with a wide polarity window (log P from 0.43 to 3.7). Different variables playing vital roles in the proposed method were evaluated and optimized. As a driving force, a 25 V electric field was applied to make the analytes migrate from the sample solution at pH 7.0, through a 3% (w/v) agarose gel of 5 mm thickness, into an acceptor phase (AP) at pH 2.0. The best extraction efficiency was obtained with an extraction duration of 25 min. With this new methodology, MOR, despite its high polarity (log P = 0.43), was efficiently extracted without any carrier or ion-pair reagents. Limits of detection (LODs) and quantification (LOQs) were in the ranges of 1.5-1.8 ng/mL and 5.0-6.0 ng/mL, respectively. Finally, the proposed method was successfully applied to determine concentrations of the model drugs in a wastewater sample. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Hybrid setup for micro- and nano-computed tomography in the hard X-ray range

    NASA Astrophysics Data System (ADS)

    Fella, Christian; Balles, Andreas; Hanke, Randolf; Last, Arndt; Zabler, Simon

    2017-12-01

    With increasing miniaturization in industry and medical technology, non-destructive testing techniques are an area of ever-increasing importance. In this framework, X-ray microscopy offers an efficient tool for the analysis, understanding, and quality assurance of microscopic samples, in particular as it allows reconstructing three-dimensional data sets of the whole sample's volume via computed tomography (CT). The following article describes a compact X-ray microscope in the hard X-ray regime around 9 keV, based on a highly brilliant liquid-metal-jet source. In comparison to commercially available instruments, it is a hybrid that works in two different modes. The first one is a micro-CT mode without optics, which uses a high-resolution detector to allow scans of samples in the millimeter range with a resolution of 1 μm. The second mode is a microscope, which contains an X-ray optical element to magnify the sample and allows resolving 150 nm features. Changing between the modes is possible without moving the sample. Thus, the instrument represents an important step towards establishing high-resolution laboratory-based multi-mode X-ray microscopy as a standard investigation method.

  7. Improving primary health care facility performance in Ghana: efficiency analysis and fiscal space implications.

    PubMed

    Novignon, Jacob; Nonvignon, Justice

    2017-06-12

    Health centers in Ghana play an important role in health care delivery, especially in deprived communities. They usually serve as the first line of service and meet basic health care needs. Unfortunately, these facilities are faced with inadequate resources. While health policy makers seek to increase resources committed to primary healthcare, it is important to understand the nature of the inefficiencies that exist in these facilities. Therefore, the objectives of this study are threefold: (i) estimate efficiency among primary health facilities (health centers), (ii) examine the potential fiscal space from improved efficiency and (iii) investigate the efficiency disparities between public and private facilities. Data were from the 2015 Access Bottlenecks, Cost and Equity (ABCE) project conducted by the Institute for Health Metrics and Evaluation. Stochastic Frontier Analysis (SFA) was used to estimate the efficiency of health facilities. Efficiency scores were then used to compute potential savings from improved efficiency. Outpatient visits were used as the output, while the number of personnel, hospital beds, and expenditure on other capital items and administration were used as inputs. Disparities in efficiency between public and private facilities were estimated using the Nopo matching decomposition procedure. The average efficiency score across all health centers in the sample was estimated to be 0.51. Average efficiency was estimated to be about 0.65 and 0.50 for private and public facilities, respectively. Significant disparities in efficiency were identified across the various administrative regions. With regard to potential fiscal space, we found that, on average, facilities could save about GH₵11,450.70 (US$7633.80) if efficiency were improved. We also found that fiscal space from efficiency gains varies across rural/urban as well as private/public facilities, if best practices are followed. The matching decomposition showed an efficiency gap of 0.29 between private and public facilities. There is a need for primary health facility managers to improve productivity via effective and efficient resource use. Efforts to improve efficiency should focus on training health workers and improving the facility environment alongside effective monitoring and evaluation exercises.

  8. Sampling in ecology and evolution - bridging the gap between theory and practice

    USGS Publications Warehouse

    Albert, C.H.; Yoccoz, N.G.; Edwards, T.C.; Graham, C.H.; Zimmermann, N.E.; Thuiller, W.

    2010-01-01

    Sampling is a key issue for answering most ecological and evolutionary questions. The importance of developing a rigorous sampling design tailored to specific questions has already been discussed in the ecological and sampling literature, which has provided useful tools and recommendations to sample and analyse ecological data. However, sampling issues are often difficult to overcome in ecological studies due to apparent inconsistencies between theory and practice, often leading to the implementation of simplified sampling designs that suffer from unknown biases. Moreover, we believe that classical sampling principles, which are based on estimation of means and variances, are insufficient to fully address many ecological questions that rely on estimating relationships between a response and a set of predictor variables over time and space. Our objective is thus to highlight the importance of selecting an appropriate sampling space and an appropriate sampling design. We also emphasize the importance of using prior knowledge of the study system to estimate models or complex parameters and thus better understand ecological patterns and the processes generating these patterns. Using a semi-virtual simulation study as an illustration, we reveal how the selection of the space (e.g. geographic, climatic) in which the sampling is designed influences the patterns that can ultimately be detected. We also demonstrate the inefficiency of common sampling designs at revealing response curves between ecological variables and climatic gradients. Further, we show that response-surface methodology, which has rarely been used in ecology, is much more efficient than more traditional methods. Finally, we discuss the use of prior knowledge, simulation studies and model-based designs in defining appropriate sampling designs. We conclude with a call for the development of methods to estimate nonlinear, ecologically relevant parameters without bias, in order to make inferences while fulfilling the requirements of both sampling theory and field-work logistics. © 2010 The Authors.

  9. Probability of detecting nematode infestations for quarantine sampling with imperfect extraction efficacy

    PubMed Central

    Chen, Peichen; Liu, Shih-Chia; Liu, Hung-I; Chen, Tse-Wei

    2011-01-01

    For quarantine sampling, it is of fundamental importance to determine the probability of finding an infestation when a specified number of units are inspected. In general, current sampling procedures assume a 100% (perfect) probability of detecting a pest if it is present within a unit. Ideally, a nematode extraction method should remove all stages of all species with 100% efficiency regardless of season, temperature, or other environmental conditions; in practice, however, no method approaches these criteria. In this study we determined the probability of detecting nematode infestations for quarantine sampling with imperfect extraction efficacy. The required sample size and the risk involved in detecting nematode infestations with imperfect extraction efficacy are also presented. Moreover, we developed a computer program to calculate confidence levels for different scenarios with varying proportions of infestation and efficacies of detection. In addition, a case study, presenting the extraction efficacy of the modified Baermann's funnel method on Aphelenchoides besseyi, is used to exemplify the use of our program to calculate the probability of detecting nematode infestations in quarantine sampling with imperfect extraction efficacy. The result has important implications for quarantine programs and highlights the need for a very large number of samples if perfect extraction efficacy is not achieved in such programs. We believe that the results of the study will be useful for the determination of realistic goals in the implementation of quarantine sampling. PMID:22791911
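
    The core quantity reduces to a binomial calculation: if a proportion p of units is infested and the extraction method reveals an infestation within an inspected unit with efficacy e, the probability of at least one detection among n units is 1 - (1 - pe)^n. A minimal sketch of that formula and the implied sample size follows (the paper's program may cover more general scenarios):

    ```python
    import math

    def detection_probability(n, p, e):
        """P(at least one detection) for n inspected units, infestation
        proportion p, and extraction efficacy e (e = 1 is the perfect case)."""
        return 1.0 - (1.0 - p * e) ** n

    def required_sample_size(conf, p, e):
        """Smallest n giving detection probability >= conf."""
        return math.ceil(math.log(1.0 - conf) / math.log(1.0 - p * e))

    # 95% confidence of detecting a 5% infestation:
    print(required_sample_size(0.95, 0.05, 1.0))   # 59 units with perfect extraction
    print(required_sample_size(0.95, 0.05, 0.4))   # 149 units at 40% efficacy
    ```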

  10. Comparison of daily and weekly precipitation sampling efficiencies using automatic collectors

    USGS Publications Warehouse

    Schroder, L.J.; Linthurst, R.A.; Ellson, J.E.; Vozzo, S.F.

    1985-01-01

    Precipitation samples were collected for approximately 90 daily and 50 weekly sampling periods at Finley Farm, near Raleigh, North Carolina, from August 1981 through October 1982. Ten wet-deposition samplers (AEROCHEM METRICS MODEL 301) were used; 4 samplers were operated for daily sampling, and 6 samplers were operated for weekly sampling periods. This design was used to determine whether: (1) collection efficiencies of precipitation are affected by small distances between the Universal (Belfort) precipitation gage and the collector; (2) measurable evaporation loss occurs; and (3) pH and specific conductance of precipitation vary significantly within small distances. Average collection efficiency was 97% for weekly sampling periods compared with the rain gage. Collection efficiencies were examined by season and precipitation volume; neither factor significantly affected collection efficiency. No evaporation loss was found by comparing daily sampling to weekly sampling at the collection site, which was classified as a subtropical climate. Correlation coefficients for pH and specific conductance of daily samples and weekly samples ranged from 0.83 to 0.99.

  11. Environmental DNA from Residual Saliva for Efficient Noninvasive Genetic Monitoring of Brown Bears (Ursus arctos)

    PubMed Central

    Wheat, Rachel E.; Allen, Jennifer M.; Miller, Sophie D. L.; Wilmers, Christopher C.; Levi, Taal

    2016-01-01

    Noninvasive genetic sampling is an important tool in wildlife ecology and management, typically relying on hair snaring or scat sampling techniques, but hair snaring is labor and cost intensive, and scats yield relatively low quality DNA. New approaches utilizing environmental DNA (eDNA) may provide supplementary, cost-effective tools for noninvasive genetic sampling. We tested whether eDNA from residual saliva on partially-consumed Pacific salmon (Oncorhynchus spp.) carcasses might yield suitable DNA quality for noninvasive monitoring of brown bears (Ursus arctos). We compared the efficiency of monitoring brown bear populations using both fecal DNA and salivary eDNA collected from partially-consumed salmon carcasses in Southeast Alaska. We swabbed a range of tissue types from 156 partially-consumed salmon carcasses from a midseason run of lakeshore-spawning sockeye (O. nerka) and a late season run of stream-spawning chum (O. keta) salmon in 2014. We also swabbed a total of 272 scats from the same locations. Saliva swabs collected from the braincases of salmon had the best amplification rate, followed by swabs taken from individual bite holes. Saliva collected from salmon carcasses identified unique individuals more quickly and required much less labor to locate than scat samples. Salmon carcass swabbing is a promising method to aid in efficient and affordable monitoring of bear populations, and suggests that the swabbing of food remains or consumed baits from other animals may be an additional cost-effective and valuable tool in the study of the ecology and population biology of many elusive and/or wide-ranging species. PMID:27828988

  12. 10 CFR Appendix B to Subpart F of... - Sampling Plan For Enforcement Testing

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... sample as follows: ER18MR98.010 where (x 1) is the measured energy efficiency, energy or water (in the...-tailed probability level and a sample size of n 1. Step 6(a). For an Energy Efficiency Standard, compare... an Energy Efficiency Standard, determine the second sample size (n 2) as follows: ER18MR98.015 where...

  13. The role of physical habitat and sampling effort on estimates of benthic macroinvertebrate taxonomic richness at basin and site scales.

    PubMed

    Silva, Déborah R O; Ligeiro, Raphael; Hughes, Robert M; Callisto, Marcos

    2016-06-01

    Taxonomic richness is one of the most important measures of biological diversity in ecological studies, including those with stream macroinvertebrates. However, it is impractical to measure the true richness of any site directly by sampling. Our objective was to evaluate the effect of sampling effort on estimates of macroinvertebrate family and Ephemeroptera, Plecoptera, and Trichoptera (EPT) genera richness at two scales: basin and stream site. In addition, we tried to determine which environmental factors at the site scale most influenced the amount of sampling effort needed. We sampled 39 sites in the Cerrado biome (neotropical savanna). In each site, we obtained 11 equidistant samples of the benthic assemblage and multiple physical habitat measurements. The observed basin-scale richness was consistently estimated by the Chao 1, Jack 1, and Jack 2 richness estimators. However, at the site scale, the observed number of taxa increased steadily with the number of samples. The models that best explained the slope of site-scale sampling curves (representing the need for greater sampling effort) included metrics that describe habitat heterogeneity, habitat structure, anthropogenic disturbance, and water quality, for both macroinvertebrate family and EPT genera richness. Our results demonstrate the importance of considering basin- and site-scale sampling effort in ecological surveys and show that taxa accumulation curves and richness estimators are good tools for assessing sampling efficiency. Physical habitat explained a significant amount of the sampling effort needed. Therefore, future studies should explore the possible implications of physical habitat characteristics when developing sampling objectives and study designs and when calculating the needed sampling effort.
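
    For reference, the two simplest estimators of the kind named above fit in a few lines: a bias-corrected Chao 1 from abundance counts and a first-order jackknife from a samples-by-species presence matrix. The counts in the example are invented.

    ```python
    import numpy as np

    def chao1(abundances):
        """Bias-corrected Chao 1 richness estimate from abundance counts."""
        a = np.asarray(abundances)
        s_obs = np.sum(a > 0)
        f1, f2 = np.sum(a == 1), np.sum(a == 2)       # singletons, doubletons
        return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

    def jackknife1(incidence):
        """First-order jackknife from a (samples x species) presence matrix."""
        m = incidence.shape[0]                        # samples taken at the site
        q1 = np.sum(incidence.sum(axis=0) == 1)       # species seen in one sample
        return incidence.any(axis=0).sum() + q1 * (m - 1) / m

    print(chao1([14, 9, 6, 4, 2, 2, 1, 1, 1, 0]))     # 9 observed -> 10 estimated
    print(jackknife1(np.array([[1, 0, 1], [0, 1, 1], [0, 0, 1]])))  # ~4.33
    ```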

  14. Emission rate and internal quantum efficiency enhancement in different geometrical shapes of GaN LED

    NASA Astrophysics Data System (ADS)

    Rashid, S.; Wahid, M. H. A.; Hambali, N. A. M. Ahmad; Halim, N. S. A. Abdul; Ramli, M. M.; Shahimin, M. M.

    2017-09-01

    This work is based on the development of light-emitting diodes (LEDs) using different top-surface geometries on a GaN p-n junction structure. Three types of LED chips were designed with different top surfaces to determine whether the p-type layer or the p-contact plays the more important role in improving efficiency. The applied voltage ranges from 0 V to 4 V. Current-voltage characteristics for all three samples were obtained and analyzed. The results show that a dome-shaped p-type layer operating at 4 V increases the emission rate and internal quantum efficiency by up to 70%, two times higher than the basic cylindrical LED chip. Moreover, this new design effectively solves the higher-forward-voltage problem of the usual curved-surface p-contact GaN LED.

  15. Static versus dynamic sampling for data mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John, G.H.; Langley, P.

    1996-12-31

    As data warehouses grow to the point where one hundred gigabytes is considered small, the computational efficiency of data-mining algorithms on large databases becomes increasingly important. Using a sample from the database can speed up the data-mining process, but this is only acceptable if it does not reduce the quality of the mined knowledge. To this end, we introduce the "Probably Close Enough" criterion to describe the desired properties of a sample. Sampling usually refers to the use of static statistical tests to decide whether a sample is sufficiently similar to the large database, in the absence of any knowledge of the tools the data miner intends to use. We discuss dynamic sampling methods, which take into account the mining tool being used and can thus give better samples. We describe dynamic schemes that observe a mining tool's performance on training samples of increasing size and use these results to determine when a sample is sufficiently large. We evaluate these sampling methods on data from the UCI repository and conclude that dynamic sampling is preferable.
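
    A minimal sketch of the dynamic idea: evaluate the mining tool on training samples of increasing size and stop once accuracy stops improving by more than a tolerance. The dataset, learner, doubling schedule, and threshold are all invented; the paper's "Probably Close Enough" criterion is a statistical notion, not this simple plateau rule.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=50_000, n_features=20, random_state=0)

    eps, prev, n = 0.005, 0.0, 500
    while True:
        # Observe the mining tool's performance on a sample of size n.
        score = cross_val_score(DecisionTreeClassifier(random_state=0),
                                X[:n], y[:n], cv=3).mean()
        if score - prev < eps or n * 2 > len(y):   # plateaued, or data exhausted
            break
        prev, n = score, n * 2                     # otherwise double the sample
    print(f"sample of {n} judged sufficient (cv accuracy {score:.3f})")
    ```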

  16. Optoelectronic system of online measurements of unburned carbon in coal fly ash

    NASA Astrophysics Data System (ADS)

    Golas, Janusz; Jankowski, Henryk; Niewczas, Bogdan; Piechna, Janusz; Skiba, Antoni; Szkutnik, Wojciech; Szkutnik, Zdzislaw P.; Wartak, Ryszarda; Worek, Cezary

    2001-08-01

    The carbon-in-ash level is an important consideration for combustion efficiency as well as ash marketing. An optoelectronic analyzing system for on-line determination and monitoring of the unburned carbon content of ash samples is presented. The apparatus operates on the principle that carbon content is proportional to the reflectance of IR light. Ash samples are collected isokinetically from the flue gas duct and placed in a sample tube with a flat glass bottom. The sample is then exposed to light, and the reflectance intensity is used by the system's computer to determine the residual carbon content from correlation curves. The sample is then air-purged back to the duct or to the attached sample canister to enable laboratory check analysis. The total cycle time is between 5 and 10 minutes. Real-time results for carbon content with an accuracy of 0.3-0.7 percent are reported and can be used for boiler control.

  17. The Conformational Flexibility of the Acyltransferase from the Disorazole Polyketide Synthase Is Revealed by an X-ray Free-Electron Laser Using a Room-Temperature Sample Delivery Method for Serial Crystallography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mathews, Irimpan I.; Allison, Kim; Robbins, Thomas

    2017-08-23

    The crystal structure of the trans-acyltransferase (AT) from the disorazole polyketide synthase (PKS) was determined at room temperature to a resolution of 2.5 Å using a new method for sample delivery directly into an X-ray free-electron laser. A novel sample extractor efficiently delivered limited quantities of microcrystals directly from the native crystallization solution into the X-ray beam at room temperature. The AT structure revealed important catalytic features of this core PKS enzyme, including the occurrence of conformational changes around the active site. The implications of these conformational changes on polyketide synthase reaction dynamics are discussed.

  18. Evaluation of air samplers and filter materials for collection and recovery of airborne norovirus.

    PubMed

    Uhrbrand, K; Koponen, I K; Schultz, A C; Madsen, A M

    2018-04-01

    The aim of this study was to identify the most efficient sampling method for quantitative PCR-based detection of airborne human norovirus (NoV). A comparative experiment was conducted in an aerosol chamber using aerosolized murine norovirus (MNV) as a surrogate for NoV. Sampling was performed using a nylon (NY) filter in conjunction with four kinds of personal samplers: Gesamtstaubprobenahme sampler (GSP), Triplex-cyclone sampler (TC), 3-piece closed-faced Millipore cassette (3P) and a 2-stage NIOSH cyclone sampler (NIO). In addition, sampling was performed using the GSP sampler with four different filter types: NY, polycarbonate (PC), polytetrafluoroethylene (PTFE) and gelatine (GEL). The sampling efficiency of MNV was significantly influenced by both sampler and filter type. The GSP sampler was found to give significantly (P < 0·05) higher recovery of aerosolized MNV than 3P and NIO. A higher recovery was also found for GSP compared with TC, albeit not significantly. Finally, recovery of aerosolized MNV was significantly (P < 0·05) higher using NY than PC, PTFE and GEL filters. The GSP sampler combined with a nylon filter was found to be the best method for personal filter-based sampling of airborne NoV. The identification of a suitable NoV air sampler is an important step towards studying the association between exposure to airborne NoV and infection. © 2017 The Society for Applied Microbiology.

  19. Particle-size distribution (PSD) of pulverized hair: A quantitative approach of milling efficiency and its correlation with drug extraction efficiency.

    PubMed

    Chagas, Aline Garcia da Rosa; Spinelli, Eliani; Fiaux, Sorele Batista; Barreto, Adriana da Silva; Rodrigues, Silvana Vianna

    2017-08-01

    Different types of hair were submitted to different milling procedures, and the resulting powders were analyzed by scanning electron microscopy (SEM) and laser diffraction (LD). The SEM results were qualitative, whereas the LD results were quantitative and accurately characterized the hair powders through their particle-size distribution (PSD). Different types of hair submitted to optimized milling conditions had quite similar PSDs. A good correlation was obtained between PSD results and the ketamine concentration in a hair sample analyzed by LC-MS/MS. Hair samples were frozen in liquid nitrogen for 5 min and pulverized at 25 Hz for 10 min, resulting in 61% of particles < 104 μm and 39% from 104 to 1000 μm. In this way, a 359% increase in measured ketamine concentration was obtained for an authentic sample extracted after pulverization, compared with the same sample cut into 1 mm fragments. When the milling time was extended to 25 min, > 90% of particles were < 60 μm and an additional increase of 52.4% in ketamine content was obtained. PSD is a key feature in the analysis of pulverized hair, as it can affect method recovery and reproducibility. In addition, PSD is an important issue in sample retesting and quality control procedures. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Solid-phase microextraction-gas chromatography-mass spectrometry for the analysis of selective serotonin reuptake inhibitors in environmental water.

    PubMed

    Lamas, J Pablo; Salgado-Petinal, Carmen; García-Jares, Carmen; Llompart, María; Cela, Rafael; Gómez, Mariano

    2004-08-13

    The continuous contamination of surface waters by pharmaceuticals is of major environmental concern. Selective serotonin reuptake inhibitors (SSRIs) are drugs currently prescribed for the treatment of depression and other psychiatric disorders, and they are thus among the pharmaceuticals that can occur in environmental waters. Solid-phase microextraction (SPME) coupled to gas chromatography-mass spectrometry has been applied to the extraction of five SSRIs (venlafaxine, fluvoxamine, fluoxetine, citalopram and sertraline) from water samples. Some of the analytes were not efficiently extracted as underivatized compounds, so an in situ acetylation step was introduced into the sample preparation procedure. Different parameters affecting extraction efficiency, such as extraction mode, fiber coating and temperature, were studied. A mixed-level fractional factorial design was also performed to simultaneously study the influence of five other experimental factors. Finally, a method based on direct SPME at 100 degrees C using polydimethylsiloxane-divinylbenzene fibers is proposed. The performance of the method was evaluated, showing good linearity and precision. The detection limits were at the sub-ng/mL level. Practical applicability was demonstrated through the analysis of real samples. Recoveries obtained for river water and wastewater samples were satisfactory in all cases. An important aspect of the proposed method is that no matrix effects were observed. Two of the target compounds, venlafaxine and citalopram, were detected and quantified in a sewage water sample.

  2. Rheological changes in irradiated chicken eggs

    NASA Astrophysics Data System (ADS)

    Ferreira, Lúcia F. S.; Del Mastro, Nélida L.

    1998-06-01

    Pathogenic bacteria may cause foodborne illnesses. Humans may introduce pathogens into foods during production, processing, distribution and/or preparation. Some of these microorganisms are able to survive conventional preservation treatments. Heat pasteurization, which is a well-established and satisfactory means of decontamination/disinfection of liquid foods, cannot efficiently achieve a similar objective for solid foods. Extensive work carried out worldwide has shown that irradiation is efficient at eradicating foodborne pathogens like Salmonella spp. that can contaminate poultry products. In this work, Co-60 gamma irradiation was applied to samples of industrial powdered white, yolk and whole egg at doses between 0 and 25 kGy. Samples were rehydrated and the viscosity measured in a Brookfield viscosimeter, model DV III, at 5, 15 and 25 °C. The rheological behaviour of the various kinds of samples was markedly different. Irradiation with doses up to 5 kGy, known to reduce bacterial contamination to non-detectable levels, produced almost no variation in the viscosity of irradiated egg-white samples. On the other hand, whole-egg and yolk samples showed some changes in rheological properties depending on the dose level, showing the predominance of either polymerization or degradation as a result of the irradiation. Additionally, irradiation of yolk powder reduced yolk color as a function of the irradiation exposure. The importance of these results is discussed in terms of possible industrial applications.

  3. Comparison of the solid-phase extraction efficiency of a bounded and an included cyclodextrin-silica microporous composite for polycyclic aromatic hydrocarbons determination in water samples.

    PubMed

    Mauri-Aucejo, Adela; Amorós, Pedro; Moragues, Alaina; Guillem, Carmen; Belenguer-Sapiña, Carolina

    2016-08-15

    Solid-phase extraction is one of the most important techniques for sample purification and concentration. A wide variety of solid phases have been used for sample preparation over time. In this work, the efficiency of a new kind of solid-phase extraction adsorbent, a microporous material made from modified cyclodextrin bound to a silica network, is evaluated through an analytical method that combines solid-phase extraction with high-performance liquid chromatography to determine polycyclic aromatic hydrocarbons in water samples. Several parameters that affect analyte recovery, such as the amount of solid phase, the nature and volume of the eluent, and the influence of sample volume and concentration, have been evaluated. The experimental results indicate that the material possesses adsorption ability for the tested polycyclic aromatic hydrocarbons. Under the optimum conditions, the quantification limits of the method were in the range of 0.09-2.4 μg L(-1), and good linear correlations between peak height and concentration were found over 1.3-70 μg L(-1). The method has good repeatability and reproducibility, with coefficients of variation under 8%. Given these results, this material may represent an alternative for trace analysis of polycyclic aromatic hydrocarbons in water through solid-phase extraction. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Effective surveillance strategies following a potential classical Swine Fever incursion in a remote wild pig population in North-Western Australia.

    PubMed

    Leslie, E; Cowled, B; Graeme Garner, M; Toribio, J-A L M L; Ward, M P

    2014-10-01

    Early disease detection and efficient methods of proving disease freedom can substantially improve the response to incursions of important transboundary animal diseases in previously free regions. We used a spatially explicit, stochastic disease spread model to simulate the spread of classical swine fever in wild pigs in a remote region of northern Australia and to assess the performance of disease surveillance strategies to detect infection at different time points and to delineate the size of the resulting outbreak. Although disease would likely be detected, simple random sampling was suboptimal. Radial and leapfrog sampling improved the effectiveness of surveillance at various stages of the simulated disease incursion. This work indicates that at earlier stages, radial sampling can reduce epidemic length and achieve faster outbreak delineation and control, but at later stages leapfrog sampling will outperform radial sampling in relation to supporting faster disease control with a less-extensive outbreak area. Due to the complexity of wildlife population dynamics and group behaviour, a targeted approach to surveillance needs to be implemented for the efficient use of resources and time. Using a more situation-based surveillance approach and accounting for disease distribution and the time period over which an epidemic has occurred is the best way to approach the selection of an appropriate surveillance strategy. © 2013 Blackwell Verlag GmbH.

  5. Estimation and modeling of electrofishing capture efficiency for fishes in wadeable warmwater streams

    USGS Publications Warehouse

    Price, A.; Peterson, James T.

    2010-01-01

    Stream fish managers often use fish sample data to inform management decisions affecting fish populations. Fish sample data, however, can be biased by the same factors affecting fish populations. To minimize the effect of sample biases on decision making, biologists need information on the effectiveness of fish sampling methods. We evaluated single-pass backpack electrofishing and seining combined with electrofishing by following a dual-gear, mark–recapture approach in 61 blocknetted sample units within first- to third-order streams. We also estimated fish movement out of unblocked units during sampling. Capture efficiency and fish abundances were modeled for 50 fish species by use of conditional multinomial capture–recapture models. The best-approximating models indicated that capture efficiencies were generally low and differed among species groups based on family or genus. Efficiencies of single-pass electrofishing and seining combined with electrofishing were greatest for Catostomidae and lowest for Ictaluridae. Fish body length and stream habitat characteristics (mean cross-sectional area, wood density, mean current velocity, and turbidity) also were related to capture efficiency of both methods, but the effects differed among species groups. We estimated that, on average, 23% of fish left the unblocked sample units, but net movement varied among species. Our results suggest that (1) common warmwater stream fish sampling methods have low capture efficiency and (2) failure to adjust for incomplete capture may bias estimates of fish abundance. We suggest that managers minimize bias from incomplete capture by adjusting data for site- and species-specific capture efficiency and by choosing sampling gear that provide estimates with minimal bias and variance. Furthermore, if block nets are not used, we recommend that managers adjust the data based on unconditional capture efficiency.
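
    The adjustment the authors recommend can be illustrated in a few lines: divide the observed count by the capture efficiency, with a binomial-based standard error if the efficiency is treated as known. The numbers are invented; the study itself estimates efficiencies with conditional multinomial capture-recapture models, which this does not reproduce.

    ```python
    import math

    def adjusted_abundance(count, p):
        """Adjust a single-pass count C for capture efficiency p: N = C / p,
        with a binomial-based standard error (treating p as known)."""
        return count / p, math.sqrt(count * (1.0 - p)) / p

    # E.g. 18 fish caught with an assumed 45% single-pass efficiency:
    print(adjusted_abundance(18, 0.45))   # ~ (40.0, 7.0)
    ```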

  6. Radiation Doses to Skin from Dermal Contamination

    DTIC Science & Technology

    2010-10-01

    included studies of deposition of particles on skin, hair or clothing of human volunteers and on samples of rat skin or other materials (filter paper ... Particle size probably is the most important parameter that affects interception and retention on skin. In a theoretical part of their paper, Asset and ... about 20% of the particles of either diameter (standard deviation about 11%) from such surfaces as cotton, paper, wood, or plastic. The efficiency

  7. Carbon coated magnetic nanoparticles as a novel magnetic solid phase extraction adsorbent for simultaneous extraction of methamphetamine and ephedrine from urine samples.

    PubMed

    Taghvimi, Arezou; Hamishehkar, Hamed

    2017-01-15

    This paper develops a highly selective, specific and efficient method for the simultaneous determination of ephedrine and methamphetamine using new carbon-coated magnetic nanoparticles (C/MNPs) as a magnetic solid-phase extraction (MSPE) adsorbent in a biological urine medium. The synthesized magnetic nano-adsorbent was fully characterized by techniques including Fourier transform infrared (FT-IR) spectroscopy, powder X-ray diffraction (XRD), scanning electron microscopy (SEM) and vibrating sample magnetometry (VSM). Nine important parameters influencing extraction efficiency (amount of adsorbent, sample volume, pH, type and amount of extraction organic solvent, extraction and desorption times, agitation rate, and ionic strength of the extraction medium) were studied and optimized. Under the optimized extraction conditions, good linearity was observed in the concentration range of 100-2000 ng/mL for ephedrine and 100-2500 ng/mL for methamphetamine. Analysis of positive urine samples by the proposed method gave recoveries of 98.71 and 97.87% for ephedrine and methamphetamine, respectively. The results indicate that carbon-coated magnetic nanoparticles could be applied in clinical and forensic laboratories for the simultaneous determination of abused drugs in urine media. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Mercury removal from coal combustion flue gas by modified fly ash.

    PubMed

    Xu, Wenqing; Wang, Hairui; Zhu, Tingyu; Kuang, Junyan; Jing, Pengfei

    2013-02-01

    Fly ash is a potential alternative to activated carbon for mercury adsorption. The effects of physicochemical properties on the mercury adsorption performance of three fly ash samples were investigated. X-ray fluorescence spectroscopy, X-ray photoelectron spectroscopy, and other methods were used to characterize the samples. Results indicate that mercury adsorption on fly ash is primarily physisorption and chemisorption. High specific surface areas and small pore diameters are beneficial to efficient mercury removal. Incompletely burned carbon is also an important factor in improving mercury removal efficiency. The C-M bond, which is formed by the reaction of C with Ti, Si and other elements, may improve mercury oxidation. The samples modified with CuBr2, CuCl2 and FeCl3 showed excellent performance for Hg removal, because the chlorine in the metal chlorides acts as an oxidant that promotes the conversion of elemental mercury (Hg0) into its oxidized form (Hg2+); Cu2+ and Fe3+ can also promote Hg0 oxidation as catalysts. HCl and O2 promote the adsorption of Hg by modified fly ash, whereas SO2 inhibits Hg adsorption because of competitive adsorption for active sites. Fly ash samples modified with CuBr2, CuCl2 and FeCl3 are therefore promising materials for controlling mercury emissions.

  9. Efficient removal of recalcitrant deep-ocean dissolved organic matter during hydrothermal circulation

    NASA Astrophysics Data System (ADS)

    Hawkes, Jeffrey A.; Rossel, Pamela E.; Stubbins, Aron; Butterfield, David; Connelly, Douglas P.; Achterberg, Eric P.; Koschinsky, Andrea; Chavagnac, Valérie; Hansen, Christian T.; Bach, Wolfgang; Dittmar, Thorsten

    2015-11-01

    Oceanic dissolved organic carbon (DOC) is an important carbon pool, similar in magnitude to atmospheric CO2, but the fate of its oldest forms is not well understood. Hot hydrothermal circulation may facilitate the degradation of otherwise un-reactive dissolved organic matter, playing an important role in the long-term global carbon cycle. The oldest, most recalcitrant forms of DOC, which make up most of oceanic DOC, can be recovered by solid-phase extraction. Here we present measurements of solid-phase extractable DOC from samples collected between 2009 and 2013 at seven vent sites in the Atlantic, Pacific and Southern oceans, along with magnesium concentrations, a conservative tracer of water circulation through hydrothermal systems. We find that magnesium and solid-phase extractable DOC concentrations are correlated, suggesting that solid-phase extractable DOC is almost entirely lost from solution through mineralization or deposition during circulation through hydrothermal vents with fluid temperatures of 212-401 °C. In laboratory experiments, where we heated samples to 380 °C for four days, we found a similar removal efficiency. We conclude that thermal degradation alone can account for the loss of solid-phase extractable DOC in natural hydrothermal systems, and that its maximum lifetime is constrained by the timescale of hydrothermal cycling, at about 40 million years.

  10. Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.

    2015-01-01

    The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations caused by the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis, which used the ITER benchmark problem, compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up the SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.

  11. A surrogate-based sensitivity quantification and Bayesian inversion of a regional groundwater flow model

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.; Amerjeed, Mansoor

    2018-02-01

    Bayesian inference using Markov Chain Monte Carlo (MCMC) provides an explicit framework for stochastic calibration of hydrogeologic models accounting for uncertainties; however, the MCMC sampling entails a large number of model calls, and could easily become computationally unwieldy if the high-fidelity hydrogeologic model simulation is time consuming. This study proposes a surrogate-based Bayesian framework to address this notorious issue, and illustrates the methodology by inverse modeling a regional MODFLOW model. The high-fidelity groundwater model is approximated by a fast statistical model using Bagging Multivariate Adaptive Regression Spline (BMARS) algorithm, and hence the MCMC sampling can be efficiently performed. In this study, the MODFLOW model is developed to simulate the groundwater flow in an arid region of Oman consisting of mountain-coast aquifers, and used to run representative simulations to generate training dataset for BMARS model construction. A BMARS-based Sobol' method is also employed to efficiently calculate input parameter sensitivities, which are used to evaluate and rank their importance for the groundwater flow model system. According to sensitivity analysis, insensitive parameters are screened out of Bayesian inversion of the MODFLOW model, further saving computing efforts. The posterior probability distribution of input parameters is efficiently inferred from the prescribed prior distribution using observed head data, demonstrating that the presented BMARS-based Bayesian framework is an efficient tool to reduce parameter uncertainties of a groundwater system.
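
    A minimal sketch of the surrogate-based loop: fit a cheap regression to a handful of runs of an expensive simulator, then run Metropolis sampling against the surrogate instead of the simulator. A toy scalar "simulator" and plain polynomial regression stand in for MODFLOW and BMARS here; the observation, prior, and proposal settings are invented.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    rng = np.random.default_rng(5)

    def expensive_model(k):
        # Stand-in for a MODFLOW run: head at a well vs. log-conductivity k.
        return 12.0 - 3.0 * k + 0.4 * k ** 2

    # 1) Train a cheap surrogate on a small set of "simulator" runs.
    k_train = np.linspace(-3, 3, 25).reshape(-1, 1)
    surrogate = make_pipeline(PolynomialFeatures(3), LinearRegression())
    surrogate.fit(k_train, expensive_model(k_train.ravel()))

    # 2) Metropolis sampling against the surrogate (no simulator calls).
    obs, sigma = 14.0, 0.5
    def log_post(k):
        pred = surrogate.predict([[k]])[0]
        return -0.5 * (k / 2.0) ** 2 - 0.5 * ((obs - pred) / sigma) ** 2

    k, lp, chain = 0.0, log_post(0.0), []
    for _ in range(5000):
        k_new = k + rng.normal(0.0, 0.3)
        lp_new = log_post(k_new)
        if np.log(rng.random()) < lp_new - lp:
            k, lp = k_new, lp_new
        chain.append(k)
    print("posterior mean of k ~", np.mean(chain[1000:]))
    ```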

  12. Efficiency of RNA extraction from selected bacteria in the context of biogas production and metatranscriptomics.

    PubMed

    Stark, Lucy; Giersch, Tina; Wünschiers, Röbbe

    2014-10-01

    Understanding the microbial population in anaerobic digestion is an essential task to increase efficient substrate use and process stability. The metabolic state, represented e.g. by the transcriptome, of a fermenting system can help to find markers for monitoring industrial biogas production to prevent failures or to model the whole process. Advances in next-generation sequencing make transcriptomes accessible for large-scale analyses. In order to analyze the metatranscriptome of a mixed-species sample, isolation of high-quality RNA is the first step. However, different extraction methods may yield different efficiencies in different species. Especially in mixed-species environmental samples, unbiased isolation of transcripts is important for meaningful conclusions. We applied five different RNA-extraction protocols to nine taxonomic diverse bacterial species. Chosen methods are based on various lysis and extraction principles. We found that the extraction efficiency of different methods depends strongly on the target organism. RNA isolation of gram-positive bacteria was characterized by low yield whilst from gram-negative species higher concentrations can be obtained. Transferring our results to mixed-species investigations, such as metatranscriptomics with biofilms or biogas plants, leads to the conclusion that particular microorganisms might be over- or underrepresented depending on the method applied. Special care must be taken when using such metatranscriptomics data for, e.g. process modeling. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Improved Estimation and Interpretation of Correlations in Neural Circuits

    PubMed Central

    Yatsenko, Dimitri; Josić, Krešimir; Ecker, Alexander S.; Froudarakis, Emmanouil; Cotton, R. James; Tolias, Andreas S.

    2015-01-01

    Ambitious projects aim to record the activity of ever larger and denser neuronal populations in vivo. Correlations in neural activity measured in such recordings can reveal important aspects of neural circuit organization. However, estimating and interpreting large correlation matrices is statistically challenging. Estimation can be improved by regularization, i.e. by imposing a structure on the estimate. The amount of improvement depends on how closely the assumed structure represents dependencies in the data. Therefore, the selection of the most efficient correlation matrix estimator for a given neural circuit must be determined empirically. Importantly, the identity and structure of the most efficient estimator informs about the types of dominant dependencies governing the system. We sought statistically efficient estimators of neural correlation matrices in recordings from large, dense groups of cortical neurons. Using fast 3D random-access laser scanning microscopy of calcium signals, we recorded the activity of nearly every neuron in volumes 200 μm wide and 100 μm deep (150–350 cells) in mouse visual cortex. We hypothesized that in these densely sampled recordings, the correlation matrix should be best modeled as the combination of a sparse graph of pairwise partial correlations representing local interactions and a low-rank component representing common fluctuations and external inputs. Indeed, in cross-validation tests, the covariance matrix estimator with this structure consistently outperformed other regularized estimators. The sparse component of the estimate defined a graph of interactions. These interactions reflected the physical distances and orientation tuning properties of cells: The density of positive ‘excitatory’ interactions decreased rapidly with geometric distances and with differences in orientation preference whereas negative ‘inhibitory’ interactions were less selective. Because of its superior performance, this ‘sparse+latent’ estimator likely provides a more physiologically relevant representation of the functional connectivity in densely sampled recordings than the sample correlation matrix. PMID:25826696
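
    A hedged sketch of the model-selection logic described above: candidate covariance estimators are compared by the Gaussian log-likelihood they assign to held-out data. The "sparse+latent" estimator itself is not in scikit-learn, so the graphical lasso appears below as the sparse-graph member of the family; the data are synthetic, with one shared latent fluctuation.

    ```python
    # Sketch: empirically selecting a regularized covariance estimator by
    # cross-validated likelihood, in the spirit of the study above.
    import numpy as np
    from sklearn.covariance import EmpiricalCovariance, LedoitWolf, GraphicalLassoCV

    rng = np.random.default_rng(1)
    n_cells, n_bins = 40, 300
    latent = rng.standard_normal((n_bins, 1))        # shared common fluctuation
    X = 0.8 * latent + rng.standard_normal((n_bins, n_cells))
    X_train, X_test = X[:200], X[200:]

    for name, est in [("sample covariance", EmpiricalCovariance()),
                      ("Ledoit-Wolf shrinkage", LedoitWolf()),
                      ("graphical lasso (sparse)", GraphicalLassoCV())]:
        est.fit(X_train)
        # .score() is the mean Gaussian log-likelihood of the held-out data
        print(f"{name:>25}: {est.score(X_test):.2f}")
    ```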

  14. Hospital electronic medical record enterprise application strategies: do they matter?

    PubMed

    Fareed, Naleef; Ozcan, Yasar A; DeShazo, Jonathan P

    2012-01-01

    Successful implementations and the ability to reap the benefits of electronic medical record (EMR) systems may be correlated with the type of enterprise application strategy that an administrator chooses when acquiring an EMR system. Moreover, identifying the most optimal enterprise application strategy is a task that may have important linkages with hospital performance. This study explored whether hospitals that have adopted differential EMR enterprise application strategies concomitantly differ in their overall efficiency. Specifically, the study examined whether hospitals with a single-vendor strategy had a higher likelihood of being efficient than those with a best-of-breed strategy and whether hospitals with a best-of-suite strategy had a higher probability of being efficient than those with best-of-breed or single-vendor strategies. A conceptual framework was used to formulate testable hypotheses. A retrospective cross-sectional approach using data envelopment analysis was used to obtain efficiency scores of hospitals by EMR enterprise application strategy. A Tobit regression analysis was then used to determine the probability of a hospital being inefficient as related to its EMR enterprise application strategy, while moderating for the hospital's EMR "implementation status" and controlling for hospital and market characteristics. The data envelopment analysis of hospitals suggested that only 32 hospitals were efficient in the study's sample of 2,171 hospitals. The results from the post hoc analysis showed partial support for the hypothesis that hospitals with a best-of-suite strategy were more likely to be efficient than those with a single-vendor strategy. This study underscores the importance of understanding the differences between the three strategies discussed in this article. On the basis of the findings, hospital administrators should consider the efficiency associations that a specific strategy may have compared with another prior to moving toward an enterprise application strategy.
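
    For readers unfamiliar with data envelopment analysis, the sketch below shows how such efficiency scores are computed: an input-oriented, constant-returns (CCR) DEA model posed as a linear program. The three "hospitals" and their inputs and outputs are invented.

    ```python
    # Sketch: input-oriented CCR DEA efficiency via linear programming.
    # Rows are decision-making units (hospitals); data are made up.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[5., 14.], [8., 15.], [7., 12.]])   # inputs (e.g., beds, staff)
    Y = np.array([[9.], [5.], [4.]])                  # outputs (e.g., discharges)
    n, m = X.shape
    s = Y.shape[1]

    def ccr_efficiency(o):
        # decision variables: [theta, lambda_1 .. lambda_n]
        c = np.zeros(1 + n); c[0] = 1.0               # minimize theta
        A_ub, b_ub = [], []
        for i in range(m):                            # sum_j lam_j x_ij <= theta x_io
            A_ub.append(np.r_[-X[o, i], X[:, i]]); b_ub.append(0.0)
        for r in range(s):                            # sum_j lam_j y_rj >= y_ro
            A_ub.append(np.r_[0.0, -Y[:, r]]); b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun                                # theta* = efficiency score

    for o in range(n):
        print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
    ```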

  15. Room temperature continuous wave, monolithic tunable THz sources based on highly efficient mid-infrared quantum cascade lasers

    PubMed Central

    Lu, Quanyong; Wu, Donghai; Sengupta, Saumya; Slivken, Steven; Razeghi, Manijeh

    2016-01-01

    A compact, high power, room temperature continuous wave terahertz source emitting in a wide frequency range (ν ~ 1–5 THz) is of great importance to terahertz system development for applications in spectroscopy, communication, sensing, and imaging. Here, we present a strong-coupled strain-balanced quantum cascade laser design for efficient THz generation based on intracavity difference frequency generation. Room temperature continuous wave emission at 3.41 THz with a side-mode suppression ratio of 30 dB and output power up to 14 μW is achieved with a wall-plug efficiency about one order of magnitude higher than previous demonstrations. With this highly efficient design, continuous wave, single mode THz emissions with a wide frequency tuning range of 2.06–4.35 THz and an output power up to 4.2 μW are demonstrated at room temperature from two monolithic three-section sampled grating distributed feedback-distributed Bragg reflector lasers. PMID:27009375

  16. Pneumatic jigging: Influence of operating parameters on separation efficiency of solid waste materials.

    PubMed

    Abd Aziz, Mohd Aizudin; Md Isa, Khairuddin; Ab Rashid, Radzuwan

    2017-06-01

    This article aims to provide insights into the factors that contribute to the separation efficiency of solid particles. In this study, a pneumatic jigging technique was used to assess the separation of solid waste materials consisting of copper, glass and rubber insulator. Several initial experiments were carried out to evaluate the strengths and limitations of the technique. It was found that, despite some limitations of the technique, all the samples prepared for the experiments were successfully separated. Follow-up experiments were then carried out to further assess the separation of copper wire and rubber insulator. The effects of air flow and pulse rates on the separation process were examined. The data from these follow-up experiments were analysed using a sink-float analysis technique. The analysis showed that the air flow rate was very important in determining the separation efficiency. However, the separation efficiency may also be influenced by the type of materials used.

  17. Effective bioleaching of chromium in tannery sludge with an enriched sulfur-oxidizing bacterial community.

    PubMed

    Zeng, Jing; Gou, Min; Tang, Yue-Qin; Li, Guo-Ying; Sun, Zhao-Yong; Kida, Kenji

    2016-10-01

    In this study, a sulfur-oxidizing community was enriched from activated sludge generated in tannery wastewater treatment plants. Bioleaching of tannery sludge containing 0.9-1.2% chromium was investigated to evaluate the effectiveness of the enriched community, the effect of chromium binding forms on bioleaching efficiency, and the dominant microbes contributing to chromium bioleaching. Sludge samples inoculated with the enriched community showed chromium leaching efficiencies of 79.9-96.8%, much higher than those without the enriched community. High bioleaching efficiencies of over 95% were achieved for chromium in the reducible fraction, while efficiencies of 60.9-97.9% were observed for chromium in the oxidizable and residual fractions. Acidithiobacillus thiooxidans, the predominant bacterium in the enriched community, played an important role in bioleaching, whereas some indigenous heterotrophic species in the sludge might have had a supporting role. The results indicated that the A. thiooxidans-dominated enriched microbial community had high chromium bioleaching efficiency and that chromium binding forms affected the bioleaching performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Room temperature continuous wave, monolithic tunable THz sources based on highly efficient mid-infrared quantum cascade lasers.

    PubMed

    Lu, Quanyong; Wu, Donghai; Sengupta, Saumya; Slivken, Steven; Razeghi, Manijeh

    2016-03-24

    A compact, high power, room temperature continuous wave terahertz source emitting in a wide frequency range (ν~1-5 THz) is of great importance to terahertz system development for applications in spectroscopy, communication, sensing, and imaging. Here, we present a strong-coupled strain-balanced quantum cascade laser design for efficient THz generation based on intracavity difference frequency generation. Room temperature continuous wave emission at 3.41 THz with a side-mode suppression ratio of 30 dB and output power up to 14 μW is achieved with a wall-plug efficiency about one order of magnitude higher than previous demonstrations. With this highly efficient design, continuous wave, single mode THz emissions with a wide frequency tuning range of 2.06-4.35 THz and an output power up to 4.2 μW are demonstrated at room temperature from two monolithic three-section sampled grating distributed feedback-distributed Bragg reflector lasers.

  19. Enhanced conformational sampling using replica exchange with concurrent solute scaling and hamiltonian biasing realized in one dimension.

    PubMed

    Yang, Mingjun; Huang, Jing; MacKerell, Alexander D

    2015-06-09

    Replica exchange (REX) is a powerful computational tool for overcoming the quasi-ergodic sampling problem of complex molecular systems. Recently, several multidimensional extensions of this method have been developed to realize exchanges in both temperature and biasing potential space or the use of multiple biasing potentials to improve sampling efficiency. However, the increased computational cost due to the multidimensionality of exchanges becomes challenging for use on complex systems under explicit solvent conditions. In this study, we develop a one-dimensional (1D) REX algorithm to concurrently combine the advantages of overall enhanced sampling from Hamiltonian solute scaling and the specific enhancement of collective variables using Hamiltonian biasing potentials. In the present Hamiltonian replica exchange method, termed HREST-BP, Hamiltonian solute scaling is applied to the solute subsystem and its interactions with the environment to enhance overall conformational transitions, and biasing potentials are added along selected collective variables associated with specific conformational transitions, thereby balancing the sampling of different hierarchical degrees of freedom. The two enhanced sampling approaches are implemented concurrently, allowing for the use of a small number of replicas (e.g., 6 to 8) in 1D, thus greatly reducing the computational cost in complex system simulations. The present method is applied to conformational sampling of two nitrogen-linked glycans (N-glycans) found on the HIV gp120 envelope protein. Considering the general importance of the conformational sampling problem, HREST-BP represents an efficient procedure for the study of complex saccharides, and, more generally, the method is anticipated to be of general utility for conformational sampling in a wide range of macromolecular systems.
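
    A toy sketch of the exchange machinery that methods of this family rely on, assuming a 1D double-well and plain Hamiltonian scaling factors in place of HREST-BP's solute-scaled potentials; the biasing-potential term is omitted.

    ```python
    # Sketch: 1D Hamiltonian replica exchange with 6 scaled replicas on a
    # double-well. Only the Metropolis swap criterion is illustrated.
    import numpy as np

    rng = np.random.default_rng(2)
    U = lambda x: 8.0 * (x ** 2 - 1.0) ** 2            # double-well (kT units)
    lams = np.array([1.0, 0.7, 0.5, 0.35, 0.25, 0.18]) # Hamiltonian scaling factors
    x = np.zeros(len(lams))                            # one coordinate per replica
    samples = []

    for sweep in range(20000):
        for m in range(len(lams)):                     # local Metropolis moves
            prop = x[m] + 0.5 * rng.standard_normal()
            if np.log(rng.uniform()) < -lams[m] * (U(prop) - U(x[m])):
                x[m] = prop
        m = rng.integers(len(lams) - 1)                # attempt one neighbor swap
        delta = (lams[m] - lams[m + 1]) * (U(x[m + 1]) - U(x[m]))
        if np.log(rng.uniform()) < -delta:
            x[m], x[m + 1] = x[m + 1], x[m]
        samples.append(x[0])                           # the unbiased replica

    # both wells should be visited despite the 8 kT barrier
    print("fraction of time in right well:", np.mean(np.array(samples) > 0))
    ```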

  20. Experimental Determination of the HPGe Spectrometer Efficiency Calibration Curves for Various Sample Geometry for Gamma Energy from 50 keV to 2000 keV

    NASA Astrophysics Data System (ADS)

    Saat, Ahmad; Hamzah, Zaini; Yusop, Mohammad Fariz; Zainal, Muhd Amiruddin

    2010-07-01

    Detection efficiency of a gamma-ray spectrometry system depends upon, among other factors, energy, sample and detector geometry, and the volume and density of the samples. In the present study, the efficiency calibration curves of a newly acquired (August 2008) HPGe gamma-ray spectrometry system were determined for four sample container geometries, namely Marinelli beaker, disc, cylindrical beaker and vial, normally used for activity determination of gamma rays from environmental samples. Calibration standards were prepared using a known amount of analytical-grade uranium trioxide ore, homogenized in plain flour in the respective containers. The ore produces gamma rays of energies ranging from 53 keV to 1001 keV. Analytical-grade potassium chloride was prepared to determine the detection efficiency of the 1460 keV gamma ray emitted by the potassium isotope K-40. Plots of detection efficiency against gamma-ray energy for the four sample geometries were found to fit smoothly to the general form ε = A·E^a + B·E^b, where ε is the efficiency; E is the energy in keV; and A, B, a and b are constants that depend on the sample geometry. All calibration curves showed the presence of a "knee" at about 180 keV. Comparison between the four geometries showed that the efficiency of the Marinelli beaker was higher than those of the cylindrical beaker and vial, while the disc geometry showed the lowest efficiency.
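
    The fitted form can be reproduced with a routine nonlinear least-squares call. In the sketch below the "measured" efficiencies are generated from assumed constants, since the paper's data are not given.

    ```python
    # Sketch: fitting eps = A*E**a + B*E**b to synthetic efficiency data.
    import numpy as np
    from scipy.optimize import curve_fit

    E = np.array([53., 81., 121., 244., 344., 661., 1001., 1460.])  # keV
    rng = np.random.default_rng(3)
    eps = 2.0 * E ** -0.5 + 60.0 * E ** -1.2        # synthetic "measurements"
    eps *= 1 + 0.02 * rng.standard_normal(E.size)   # 2% noise

    model = lambda E, A, a, B, b: A * E ** a + B * E ** b
    p, _ = curve_fit(model, E, eps, p0=(1.0, -0.4, 30.0, -1.0))
    print("fitted A, a, B, b:", np.round(p, 3))
    ```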

  1. TemperSAT: A new efficient fair-sampling random k-SAT solver

    NASA Astrophysics Data System (ADS)

    Fang, Chao; Zhu, Zheng; Katzgraber, Helmut G.

    The set membership problem is of great importance to many applications and, in particular, database searches for target groups. Recently, an approach to speed up set membership searches based on the NP-hard constraint-satisfaction problem (random k-SAT) has been developed. However, the bottleneck of the approach lies in finding the solution to a large SAT formula efficiently and, in particular, a large number of independent solutions is needed to reduce the probability of false positives. Unfortunately, traditional random k-SAT solvers such as WalkSAT are biased when seeking solutions to the Boolean formulas. By porting parallel tempering Monte Carlo to the sampling of binary optimization problems, we introduce a new algorithm (TemperSAT) whose performance is comparable to current state-of-the-art SAT solvers for large k with the added benefit that theoretically it can find many independent solutions quickly. We illustrate our results by comparing to the currently fastest implementation of WalkSAT, WalkSATlm.
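
    The following toy illustrates the underlying idea of sampling SAT solutions with parallel tempering, with the energy defined as the number of unsatisfied clauses; it is not the authors' TemperSAT implementation, and the instance sizes are arbitrary.

    ```python
    # Sketch: parallel tempering over a random 3-SAT instance.
    # Energy = number of unsatisfied clauses; solutions have energy 0.
    import numpy as np

    rng = np.random.default_rng(4)
    n_vars, n_clauses = 30, 120
    vars_ = rng.integers(0, n_vars, size=(n_clauses, 3))  # variable indices
    negs = rng.random((n_clauses, 3)) < 0.5               # True = negated literal

    def energy(a):                          # unsatisfied-clause count
        lits = a[vars_] ^ negs              # literal truth values
        return int(np.sum(~lits.any(axis=1)))

    betas = np.linspace(0.2, 3.0, 8)        # inverse temperatures
    states = rng.random((len(betas), n_vars)) < 0.5
    E = np.array([energy(s) for s in states])

    solutions = set()
    for sweep in range(4000):
        for k, beta in enumerate(betas):    # single-bit Metropolis flips
            i = rng.integers(n_vars)
            states[k, i] ^= True
            E_new = energy(states[k])
            if rng.uniform() < np.exp(-beta * (E_new - E[k])):
                E[k] = E_new
            else:
                states[k, i] ^= True        # reject: flip back
        k = rng.integers(len(betas) - 1)    # neighbor temperature swap
        d = (betas[k] - betas[k + 1]) * (E[k] - E[k + 1])
        if rng.uniform() < np.exp(d):
            states[[k, k + 1]] = states[[k + 1, k]]
            E[[k, k + 1]] = E[[k + 1, k]]
        if E[-1] == 0:                      # coldest replica found a solution
            solutions.add(states[-1].tobytes())
    print("distinct satisfying assignments found:", len(solutions))
    ```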

  2. Choosing experiments to accelerate collective discovery

    PubMed Central

    Rzhetsky, Andrey; Foster, Jacob G.; Foster, Ian T.

    2015-01-01

    A scientist’s choice of research problem affects his or her personal career trajectory. Scientists’ combined choices affect the direction and efficiency of scientific discovery as a whole. In this paper, we infer preferences that shape problem selection from patterns of published findings and then quantify their efficiency. We represent research problems as links between scientific entities in a knowledge network. We then build a generative model of discovery informed by qualitative research on scientific problem selection. We map salient features from this literature to key network properties: an entity’s importance corresponds to its degree centrality, and a problem’s difficulty corresponds to the network distance it spans. Drawing on millions of papers and patents published over 30 years, we use this model to infer the typical research strategy used to explore chemical relationships in biomedicine. This strategy generates conservative research choices focused on building up knowledge around important molecules. These choices become more conservative over time. The observed strategy is efficient for initial exploration of the network and supports scientific careers that require steady output, but is inefficient for science as a whole. Through supercomputer experiments on a sample of the network, we study thousands of alternatives and identify strategies much more efficient at exploring mature knowledge networks. We find that increased risk-taking and the publication of experimental failures would substantially improve the speed of discovery. We consider institutional shifts in grant making, evaluation, and publication that would help realize these efficiencies. PMID:26554009

  3. Column-coupling strategies for multidimensional electrophoretic separation techniques.

    PubMed

    Kler, Pablo A; Sydes, Daniel; Huhn, Carolin

    2015-01-01

    Multidimensional electrophoretic separations represent one of the most common strategies for dealing with the analysis of complex samples. In recent years we have been witnessing the explosive growth of separation techniques for the analysis of complex samples in applications ranging from life sciences to industry. In this sense, electrophoretic separations offer several strategic advantages such as excellent separation efficiency, different methods with a broad range of separation mechanisms, and low liquid consumption generating less waste effluents and lower costs per analysis, among others. Despite their impressive separation efficiency, multidimensional electrophoretic separations present some drawbacks that have delayed their extensive use: the volumes of the columns, and consequently of the injected sample, are significantly smaller compared to other analytical techniques, thus the coupling interfaces between two separations components must be very efficient in terms of providing geometrical precision with low dead volume. Likewise, very sensitive detection systems are required. Additionally, in electrophoretic separation techniques, the surface properties of the columns play a fundamental role for electroosmosis as well as the unwanted adsorption of proteins or other complex biomolecules. In this sense the requirements for an efficient coupling for electrophoretic separation techniques involve several aspects related to microfluidics and physicochemical interactions of the electrolyte solutions and the solid capillary walls. It is interesting to see how these multidimensional electrophoretic separation techniques have been used jointly with different detection techniques, for intermediate detection as well as for final identification and quantification, particularly important in the case of mass spectrometry. In this work we present a critical review about the different strategies for coupling two or more electrophoretic separation techniques and the different intermediate and final detection methods implemented for such separations.

  4. Extension of latin hypercube samples with correlated variables.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hora, Stephen Curtis; Helton, Jon Craig; Sallaberry, Cedric J. PhD.

    2006-11-01

    A procedure for extending the size of a Latin hypercube sample (LHS) with rank correlated variables is described and illustrated. The extension procedure starts with an LHS of size m and associated rank correlation matrix C and constructs a new LHS of size 2m that contains the elements of the original LHS and has a rank correlation matrix that is close to the original rank correlation matrix C. The procedure is intended for use in conjunction with uncertainty and sensitivity analysis of computationally demanding models in which it is important to make efficient use of a necessarily limited number of model evaluations.
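
    A sketch of the basic doubling step for an uncorrelated LHS: each variable's m strata are refined to 2m, the original points are kept, and new points fill the strata left empty. The report's rank-correlation restoration step is omitted here.

    ```python
    # Sketch: extending an m-point Latin hypercube sample to 2m points
    # while keeping the original points (independent variables only).
    import numpy as np

    rng = np.random.default_rng(5)

    def lhs(m, d):
        u = (rng.random((m, d)) + np.arange(m)[:, None]) / m
        for j in range(d):
            u[:, j] = rng.permutation(u[:, j])
        return u

    def extend_lhs(sample):
        m, d = sample.shape
        new_cols = []
        for j in range(d):
            occupied = np.floor(sample[:, j] * (2 * m)).astype(int)
            empty = np.setdiff1d(np.arange(2 * m), occupied)
            pts = (empty + rng.random(m)) / (2 * m)   # one point per empty stratum
            new_cols.append(rng.permutation(pts))     # random pairing across dims
        return np.vstack([sample, np.column_stack(new_cols)])

    A = lhs(10, 3)
    B = extend_lhs(A)        # 20 points, still Latin in each variable
    # each of the 20 strata of width 1/20 now holds exactly one point:
    assert all(len(set(np.floor(B[:, j] * 20).astype(int))) == 20 for j in range(3))
    ```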

  5. Adhesive quality inspection of wind rotor blades using thermography

    NASA Astrophysics Data System (ADS)

    Li, Xiaoli; Sun, Jiangang; Shen, Jingling; Wang, Xun; Zhang, Cunlin; Zhao, Yuejin

    2018-04-01

    Wind power is playing an increasingly important role in ensuring electrical safety for human beings. Because wind rotor blades are getting larger and larger in order to harvest wind energy more efficiently, there is a growing demand for nondestructive testing. Due to the glued structure of rotor blades, adhesive quality evaluation is needed. In this study, three adhesive samples with wall thicknesses of 13 mm, 28 mm or 31 mm were each designed with a different adhesive condition. Transmission thermography was applied to inspect the samples. The results illustrate that this method is effective for inspecting the adhesive quality of wind rotor blades.

  6. A tree-like Bayesian structure learning algorithm for small-sample datasets from complex biological model systems.

    PubMed

    Yin, Weiwei; Garimalla, Swetha; Moreno, Alberto; Galinski, Mary R; Styczynski, Mark P

    2015-08-28

    There are increasing efforts to bring high-throughput systems biology techniques to bear on complex animal model systems, often with a goal of learning about underlying regulatory network structures (e.g., gene regulatory networks). However, complex animal model systems typically have significant limitations on cohort sizes, number of samples, and the ability to perform follow-up and validation experiments. These constraints are particularly problematic for many current network learning approaches, which require large numbers of samples and may predict many more regulatory relationships than actually exist. Here, we test the idea that by leveraging the accuracy and efficiency of classifiers, we can construct high-quality networks that capture important interactions between variables in datasets with few samples. We start from a previously-developed tree-like Bayesian classifier and generalize its network learning approach to allow for arbitrary depth and complexity of tree-like networks. Using four diverse sample networks, we demonstrate that this approach performs consistently better at low sample sizes than the Sparse Candidate Algorithm, a representative approach for comparison because it is known to generate Bayesian networks with high positive predictive value. We develop and demonstrate a resampling-based approach to enable the identification of a viable root for the learned tree-like network, important for cases where the root of a network is not known a priori. We also develop and demonstrate an integrated resampling-based approach to the reduction of variable space for the learning of the network. Finally, we demonstrate the utility of this approach via the analysis of a transcriptional dataset of a malaria challenge in a non-human primate model system, Macaca mulatta, suggesting the potential to capture indicators of the earliest stages of cellular differentiation during leukopoiesis. We demonstrate that by starting from effective and efficient approaches for creating classifiers, we can identify interesting tree-like network structures with significant ability to capture the relationships in the training data. This approach represents a promising strategy for inferring networks with high positive predictive value under the constraint of small numbers of samples, meeting a need that will only continue to grow as more high-throughput studies are applied to complex model systems.
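
    As a concrete, simpler relative of tree-like network learning, the classic Chow-Liu construction below builds a tree-structured dependency network from pairwise mutual information; it is shown only to make the idea tangible and is not the authors' algorithm. The data are synthetic, with a planted chain of dependencies.

    ```python
    # Sketch: Chow-Liu tree = maximum-weight spanning tree over pairwise
    # mutual information, learned from a small-sample binary dataset.
    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree
    from sklearn.metrics import mutual_info_score

    rng = np.random.default_rng(6)
    n, p = 40, 6                                 # few samples, several variables
    X = rng.integers(0, 2, size=(n, p))
    X[:, 1] = X[:, 0] ^ (rng.random(n) < 0.1)    # variable 1 tracks variable 0
    X[:, 2] = X[:, 1] ^ (rng.random(n) < 0.1)    # variable 2 tracks variable 1

    MI = np.zeros((p, p))
    for i in range(p):
        for j in range(i + 1, p):
            MI[i, j] = mutual_info_score(X[:, i], X[:, j])

    # max-weight spanning tree == min spanning tree of negated weights
    # (empirical MI is essentially never exactly zero, so no edges vanish)
    tree = minimum_spanning_tree(-MI)
    edges = np.transpose(np.nonzero(tree.toarray()))
    print("learned tree edges:", [tuple(e) for e in edges])
    ```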

  7. Design and evaluation of a nondestructive fissile assay device for HTGR fuel samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeany, S. R.; Knoll, R. W.; Jenkins, J. D.

    1979-02-01

    Nondestructive assay of fissile material plays an important role in nuclear fuel processing facilities. Information for product quality control, plant criticality safety, and nuclear materials accountability can be obtained from assay devices. All of this is necessary for a safe, efficient, and orderly operation of a production plant. Presented here is a design description and an operational evaluation of a device developed to nondestructively assay small samples of High-Temperature Gas-Cooled Reactor (HTGR) fuel. The measurement technique employed consists in thermal-neutron irradiation of a sample followed by pneumatic transfer to a high-efficiency neutron detector where delayed neutrons are counted. In general,more » samples undergo several irradiation and count cycles during a measurement. The total number of delayed-neutron counts accumulated is translated into grams of fissile mass through comparison with the counts accumulated in an identical irradiation and count sequence of calibration standards. Successful operation of the device through many experiments over a one-year period indicates high operational reliability. Tests of assay precision show this to be better than 0.25% for measurements of 10 min. Assay biases may be encountered if calibration standards are not representative of unknown samples, but reasonable care in construction and control of standards should lead to no more than 0.2% bias in the measurements. Nondestructive fissile assay of HTGR fuel samples by thermal-neutron irradiation and delayed-neutron detection has been demonstrated to be a rapid and accurate analysis technique. However, careful attention and control must be given to calibration standards to see that they remain representative of unknown samples.« less

  8. Detection of plant-based adulterants in turmeric powder using DNA barcoding.

    PubMed

    Parvathy, V A; Swetha, V P; Sheeja, T E; Sasikumar, B

    2015-01-01

    In its powdered form, turmeric [Curcuma longa L. (Zingiberaceae)], a spice of medical importance, is often adulterated, lowering its quality. This study sought to detect plant-based adulterants in traded turmeric powder using DNA barcoding. Accessions of Curcuma longa L., Curcuma zedoaria Rosc. (Zingiberaceae), and cassava starch served as reference samples. Three barcoding loci, namely ITS, rbcL, and matK, were used for PCR amplification of the reference samples and commercial samples representing 10 different companies. PCR success rate, sequencing efficiency, occurrence of SNPs, and BLAST analysis were used to assess the potential of the barcoding loci in authenticating the traded samples of turmeric. The PCR and sequencing success rates of the loci rbcL and ITS were found to be 100%, whereas matK showed no amplification. ITS proved to be the ideal locus because it showed greater variability than rbcL in discriminating the Curcuma species. The presence of C. zedoaria could be detected in one of the samples, whereas cassava starch, wheat, barley, and rye were detected in two other samples, although the labels claimed nothing other than turmeric powder. Unlabeled materials in turmeric powder are considered adulterants or fillers, added to increase the bulk weight and starch content of the commodity for economic gain. These adulterants pose potential health hazards to consumers who are allergic to these plants, lowering the product's medicinal value and belying the claim that the product is gluten free. The study proved DNA barcoding to be an efficient tool for testing the integrity and authenticity of commercial turmeric products.

  9. Using auxiliary information to improve wildlife disease surveillance when infected animals are not detected: a Bayesian approach

    USGS Publications Warehouse

    Heisey, Dennis M.; Jennelle, Christopher S.; Russell, Robin E.; Walsh, Daniel P.

    2014-01-01

    There are numerous situations in which it is important to determine whether a particular disease of interest is present in a free-ranging wildlife population. However, adequate disease surveillance can be labor-intensive and expensive, and thus there is substantial motivation to conduct it as efficiently as possible. Surveillance is often based on the assumption of a simple random sample, but this can almost always be improved upon if auxiliary information about disease risk factors is available. We present a Bayesian approach to disease surveillance when auxiliary risk information is available, which will usually allow for substantial improvements over simple random sampling. Others have employed risk weights in surveillance, but this can result in overly optimistic statements regarding freedom from disease because the uncertainty in the auxiliary information is not accounted for; our approach remedies this. We compare our Bayesian approach to a published example of risk weights applied to chronic wasting disease in deer in Colorado, and we also present calculations to examine when uncertainty in the auxiliary information has a serious impact on the risk weights approach. Our approach allows "apples-to-apples" comparisons of surveillance efficiencies between units where heterogeneous samples were collected.
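
    A hedged Monte Carlo sketch of the central point, that uncertainty in the auxiliary risk information must be propagated: here the relative risk of the high-risk group is drawn from a prior rather than fixed, and the spread of the resulting posterior confidence of disease freedom is reported. All priors, sample sizes, and the freedom-from-disease formula below are standard illustrative choices, not the paper's model.

    ```python
    # Sketch: freedom-from-disease confidence with uncertain risk weights.
    import numpy as np

    rng = np.random.default_rng(7)
    design_prev = 0.01            # design prevalence in the standard-risk group
    n_low, n_high = 300, 100      # negative samples per risk group
    se = 0.95                     # test sensitivity
    p_free_prior = 0.5            # prior probability the population is free

    draws = []
    for _ in range(10000):
        rr = rng.gamma(4.0, 0.75)             # uncertain relative risk (mean 3)
        p_low = design_prev
        p_high = min(1.0, rr * design_prev)
        # probability all samples test negative if disease is present
        p_miss = (1 - se * p_low) ** n_low * (1 - se * p_high) ** n_high
        sse = 1 - p_miss                      # surveillance sensitivity
        post_free = p_free_prior / (p_free_prior + (1 - p_free_prior) * p_miss)
        draws.append(post_free)

    print("posterior P(free): mean %.3f, 5th percentile %.3f"
          % (np.mean(draws), np.percentile(draws, 5)))
    ```

    The 5th percentile is the operative number: a fixed-weight calculation would report only the mean-like point value and thus overstate confidence whenever the relative risk is poorly known.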

  10. Application of adaptive cluster sampling to low-density populations of freshwater mussels

    USGS Publications Warehouse

    Smith, D.R.; Villella, R.F.; Lemarie, D.P.

    2003-01-01

    Freshwater mussels appear to be promising candidates for adaptive cluster sampling because they are benthic macroinvertebrates that cluster spatially and are frequently found at low densities. We applied adaptive cluster sampling to estimate density of freshwater mussels at 24 sites along the Cacapon River, WV, where a preliminary timed search indicated that mussels were present at low density. Adaptive cluster sampling increased yield of individual mussels and detection of uncommon species; however, it did not improve precision of density estimates. Because finding uncommon species, collecting individuals of those species, and estimating their densities are important conservation activities, additional research is warranted on application of adaptive cluster sampling to freshwater mussels. However, at this time we do not recommend routine application of adaptive cluster sampling to freshwater mussel populations. The ultimate, and currently unanswered, question is how to tell when adaptive cluster sampling should be used, i.e., when is a population sufficiently rare and clustered for adaptive cluster sampling to be efficient and practical? A cost-effective procedure needs to be developed to identify biological populations for which adaptive cluster sampling is appropriate.

  11. Measurement of Biocolloid Collision Efficiencies for Granular Activated Carbon by Use of a Two-Layer Filtration Model

    PubMed Central

    Paramonova, Ekaterina; Zerfoss, Erica L.; Logan, Bruce E.

    2006-01-01

    Point-of-use filters containing granular activated carbon (GAC) are an effective method for removing certain chemicals from water, but their ability to remove bacteria and viruses has been relatively untested. Collision efficiencies (α) were determined using clean-bed filtration theory for two bacteria (Raoultella terrigena 33257 and Escherichia coli 25922), a bacteriophage (MS2), and latex microspheres for four GAC samples. These GAC samples had particle size distributions that were bimodal, but only a single particle diameter can be used in the filtration equation. Therefore, consistent with previous reports, we used a particle diameter based on the smallest diameter of the particles (derived from the projected areas of the 10% smallest particles). The bacterial collision efficiencies calculated using the filtration model were high (0.8 ≤ α ≤ 4.9), indicating that GAC was an effective capture material. Collision efficiencies greater than unity reflect an underestimation of the collision frequency, likely as a result of particle roughness and wide GAC size distributions. The collision efficiencies for microspheres (0.7 ≤ α ≤ 3.5) were similar to those obtained for bacteria, suggesting that the microspheres were a reasonable surrogate for the bacteria. The bacteriophage collision efficiencies ranged from ≥0.2 to ≤0.4. The predicted levels of removal for 1-cm-thick carbon beds ranged from 0.8 to 3 log for the bacteria and from 0.3 to 1.0 log for the phage. These tests demonstrated that GAC can be an effective material for removal of bacteria and phage and that GAC particle size is a more important factor than relative stickiness for effective particle removal. PMID:16885264
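
    The back-calculation of α from column removal data follows a standard clean-bed (Yao-type) relation; the sketch below uses invented numbers, and in practice the single-collector contact efficiency η0 would come from a correlation rather than being assumed.

    ```python
    # Sketch: collision efficiency from the clean-bed filtration relation
    #   ln(C/C0) = -1.5 * (1 - eps) * eta0 * alpha * L / dc
    # All parameter values below are assumptions for illustration.
    import numpy as np

    C_over_C0 = 0.02        # fraction of cells passing the column
    porosity = 0.40         # bed porosity (eps)
    eta0 = 0.08             # single-collector contact efficiency (assumed)
    L = 0.01                # bed depth, m (1 cm)
    dc = 2.0e-4             # collector (grain) diameter, m

    alpha = -np.log(C_over_C0) * dc / (1.5 * (1 - porosity) * eta0 * L)
    print(f"collision efficiency alpha = {alpha:.2f}")   # ~1.1 here
    ```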

  12. A Simulation Approach to Assessing Sampling Strategies for Insect Pests: An Example with the Balsam Gall Midge

    PubMed Central

    Carleton, R. Drew; Heard, Stephen B.; Silk, Peter J.

    2013-01-01

    Estimation of pest density is a basic requirement for integrated pest management in agriculture and forestry, and efficiency in density estimation is a common goal. Sequential sampling techniques promise efficient sampling, but their application can involve cumbersome mathematics and/or intensive warm-up sampling when pests have complex within- or between-site distributions. We provide tools for assessing the efficiency of sequential sampling and of alternative, simpler sampling plans, using computer simulation with “pre-sampling” data. We illustrate our approach using data for balsam gall midge (Paradiplosis tumifex) attack in Christmas tree farms. Paradiplosis tumifex proved recalcitrant to sequential sampling techniques. Midge distributions could not be fit by a common negative binomial distribution across sites. Local parameterization, using warm-up samples to estimate the clumping parameter k for each site, performed poorly: k estimates were unreliable even for samples of n∼100 trees. These methods were further confounded by significant within-site spatial autocorrelation. Much simpler sampling schemes, involving random or belt-transect sampling to preset sample sizes, were effective and efficient for P. tumifex. Sampling via belt transects (through the longest dimension of a stand) was the most efficient, with sample means converging on true mean density for sample sizes of n∼25–40 trees. Pre-sampling and simulation techniques provide a simple method for assessing sampling strategies for estimating insect infestation. We suspect that many pests will resemble P. tumifex in challenging the assumptions of sequential sampling methods. Our software will allow practitioners to optimize sampling strategies before they are brought to real-world applications, while potentially avoiding the need for the cumbersome calculations required for sequential sampling methods. PMID:24376556
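
    The pre-sampling simulation idea can be illustrated in a few lines: resample a pilot dataset at several candidate sample sizes and watch the precision of the sample mean. The pilot counts below are drawn from a negative binomial purely for illustration.

    ```python
    # Sketch: simulation-based assessment of sampling plans from pilot data.
    import numpy as np

    rng = np.random.default_rng(8)
    pilot = rng.negative_binomial(n=2, p=0.3, size=500)   # galls per tree (toy)

    for n_trees in (10, 25, 40, 80):
        means = [rng.choice(pilot, n_trees, replace=False).mean()
                 for _ in range(2000)]
        rel_err = np.std(means) / pilot.mean()
        print(f"n = {n_trees:3d}: relative SE of the mean = {rel_err:.2f}")
    ```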

  13. Water accounting implementation: water footprint and water efficiency of the coffee shop in Indonesia

    NASA Astrophysics Data System (ADS)

    Hendratno, S. P.; Agustine, Y.

    2018-01-01

    The purpose of this paper is to understand water accounting practice in industry, particularly the beverage industry in Indonesia. The sample in this study is one coffee shop near Jakarta, and a case study was chosen as the method. We collected data through semi-structured interviews, observation, and a survey about water efficiency in the coffee shop. Operational staff such as the barista, cashier, supervisor, and store manager were the respondents. Operational management already understands the importance of water efficiency in the coffee shop's operation, but it cannot yet be implemented because the standard operating procedures do not include water efficiency. The coffee shop's cleaning routine always takes considerable time and uses a great deal of water; cleaning the bar and all operational equipment takes one to two hours each day. This paper examines water efficiency in the coffee shop, focusing on its water footprint, the operational standards used every day, and the connection between those operational standards and water efficiency.

  14. Negative Association of Hospital Efficiency Under Increasing Geographic Elevation on Acute Myocardial Infarction In-Patient Mortality.

    PubMed

    Devaraj, Srikant; Patel, Pankaj C

    Although variation in patient outcomes based on hospitals' geographic location has been studied, the altitude of hospitals above sea level may also affect patient outcomes. Possibly because of negative physical and psychological effects of altitude on hospital employees, hospital efficiency may decline at higher altitudes. A greater focus on hospital efficiency, despite decreasing efficiency at higher altitudes, could increase demands on hospital employees and further deteriorate patient outcomes. Using data envelopment analysis on a sample of 840 hospital-year observations representing 95,504 patients with acute myocardial infarction (AMI) in the United States, and controlling for patient, hospital, and county characteristics as well as hospital, state, and year fixed effects, we find support for the negative association between hospital altitude and efficiency; for a 1 percentage point increase in efficiency and every 1,000-foot increase in altitude above sea level, the mortality of patients with AMI increases by 0.66 percentage points. The findings have implications for hospital performance at increasing geographic elevation and introduce to the literature the notion of the "health economics of elevation," suggesting that the elevation of a hospital may be an important criterion for policy makers and insurance firms to consider.

  15. Effects of the Discharge Parameters on the Efficiency and Stability of Ambient Metastable-Induced Desorption Ionization

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaotian; Chen, Chilai; Liu, Youjiang; Wang, Hongwei; Zhang, Lehua; Kong, Deyi; Mario, Chavarria

    2015-12-01

    Ionization efficiency is an important factor for ion sources in mass spectrometry and ion mobility spectrometry. Using helium as the discharge gas, acetone as the sample, and high-field asymmetric ion mobility spectrometry (FAIMS) as the ion detection method, this work investigates in detail the effects of discharge parameters on the efficiency of ambient metastable-induced desorption ionization (AMDI) at atmospheric pressure. The results indicate that the discharge power and gas flow rate are both significantly correlated with the ionization efficiency. Specifically, an increase in the applied discharge power leads to a rapid increase in the ionization efficiency, which gradually reaches equilibrium due to ion saturation. Moreover, when the discharge voltage is fixed at 2.1 kV, a maximum efficiency can be achieved at a flow rate of 9.0 m/s. This study provides a foundation for the design and application of AMDI for on-line detection with mass spectrometry and ion mobility spectrometry. Supported by the National Natural Science Foundation of China (No. 61374016), the Changzhou Science and Technology Support Program, China (No. CE20120081), and the External Cooperation Program of the Chinese Academy of Sciences (No. GJHZ1218).

  16. Technical efficiency of peripheral health units in Pujehun district of Sierra Leone: a DEA application

    PubMed Central

    Renner, Ade; Kirigia, Joses M; Zere, Eyob A; Barry, Saidou P; Kirigia, Doris G; Kamara, Clifford; Muthuri, Lenity HK

    2005-01-01

    Background: The Data Envelopment Analysis (DEA) method has been fruitfully used in many countries in Asia, Europe and North America to shed light on the efficiency of health facilities and programmes. There is, however, a dearth of such studies in countries in sub-Saharan Africa. Since hospitals and health centres are important instruments in the efforts to scale up pro-poor cost-effective interventions aimed at achieving the United Nations Millennium Development Goals, decision-makers need to ensure that these health facilities provide efficient services. The objective of this study was to measure the technical efficiency (TE) and scale efficiency (SE) of a sample of public peripheral health units (PHUs) in Sierra Leone. Methods: This study applied the Data Envelopment Analysis approach to investigate TE and SE among a sample of 37 PHUs in Sierra Leone. Results: Twenty-two (59%) of the 37 health units analysed were found to be technically inefficient, with an average score of 63% (standard deviation = 18%). In addition, 24 (65%) health units were found to be scale inefficient, with an average scale efficiency score of 72% (standard deviation = 17%). Conclusion: With the existing high levels of pure technical and scale inefficiency, scaling up of interventions to achieve both global and regional targets such as the MDG and Abuja health targets becomes far-fetched. In a country with per capita expenditure on health of about US$7, and with only 30% of its population having access to health services, it is demonstrated that efficiency savings could significantly augment the government's initiatives to cater for the unmet health care needs of the population. We therefore strongly recommend that Sierra Leone and all other countries in the Region institutionalise health facility efficiency monitoring at the Ministry of Health headquarters (MoH/HQ) and at each district health headquarters. PMID:16354299

  17. Porphyrin-based magnetic nanocomposites for efficient extraction of polycyclic aromatic hydrocarbons from water samples.

    PubMed

    Yu, Jing; Zhu, Shukui; Pang, Liling; Chen, Pin; Zhu, Gang-Tian

    2018-03-09

    Stable and reusable porphyrin-based magnetic nanocomposites were successfully synthesized for efficient extraction of polycyclic aromatic hydrocarbons (PAHs) from environmental water samples. Meso-tetra(4-carboxyphenyl)porphyrin (TCPP) can connect to the copolymer after amidation and was linked to Fe3O4@SiO2 magnetic nanospheres via cross-coupling. Several characterization techniques, such as field emission scanning electron microscopy, transmission electron microscopy, X-ray diffraction, Fourier transform infrared spectrometry, vibrating sample magnetometry and tensiometry, were used to characterize the as-synthesized materials. The structure of the copolymer was similar to that of graphene, possessing sp2-conjugated carbon rings, but with an appropriate amount of delocalized π-electrons, giving rise to higher extraction efficiency for heavy PAHs without sacrificing performance in the extraction of light PAHs. Six extraction parameters, including the TCPP:Fe3O4@SiO2 (m:m) ratio, the amount of adsorbent, the type of desorption solvent, the desorption solvent volume, the adsorption time and the desorption time, were investigated. After optimization of the extraction conditions, a comparison of the extraction efficiencies of Fe3O4@SiO2-TCPP and Fe3O4@SiO2@GO was carried out. The adsorption mechanism of TCPP toward PAHs was studied by first-principles density functional theory (DFT) calculations. Combining experimental and calculated results, it was shown that π-π stacking is the main adsorption mechanism of TCPP for PAHs and that the amount of delocalized π-electrons plays an important role in the elution process. Under the optimal conditions, Fe3O4@SiO2-porphyrin showed good precision in intra-day (<8.9%) and inter-day (<13.0%) detection, low method detection limits (2-10 ng/L), and wide linearity (10-10000 ng/L). The method was applied to the simultaneous analysis of 15 PAHs with acceptable recoveries of 71.1%-106.0% for groundwater samples and 73.7%-107.1% for Yangtze River water samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Efficient Bayesian parameter estimation with implicit sampling and surrogate modeling for a vadose zone hydrological problem

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Pau, G. S. H.; Finsterle, S.

    2015-12-01

    Parameter inversion involves inferring the model parameter values based on sparse observations of some observables. To infer the posterior probability distributions of the parameters, Markov chain Monte Carlo (MCMC) methods are typically used. However, the large number of forward simulations needed and limited computational resources limit the complexity of the hydrological model we can use in these methods. In view of this, we studied the implicit sampling (IS) method, an efficient importance sampling technique that generates samples in the high-probability region of the posterior distribution and thus reduces the number of forward simulations that we need to run. For a pilot-point inversion of a heterogeneous permeability field based on a synthetic ponded infiltration experiment simulated with TOUGH2 (a subsurface modeling code), we showed that IS with linear map provides an accurate Bayesian description of the parameterized permeability field at the pilot points with just approximately 500 forward simulations. We further studied the use of surrogate models to improve the computational efficiency of parameter inversion. We implemented two reduced-order models (ROMs) for the TOUGH2 forward model. One is based on polynomial chaos expansion (PCE), of which the coefficients are obtained using the sparse Bayesian learning technique to mitigate the "curse of dimensionality" of the PCE terms. The other model is Gaussian process regression (GPR), for which different covariance, likelihood and inference models are considered. Preliminary results indicate that ROMs constructed based on the prior parameter space perform poorly. It is thus impractical to replace the hydrological model by a ROM directly in an MCMC method. However, the IS method can work with a ROM constructed for parameters in the close vicinity of the maximum a posteriori probability (MAP) estimate. We will discuss the accuracy and computational efficiency of using ROMs in the implicit sampling procedure for the hydrological problem considered. This work was supported, in part, by the U.S. Dept. of Energy under Contract No. DE-AC02-05CH11231.
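
    A minimal sketch of the mechanism underlying implicit sampling with a linear map: draw from a Gaussian fitted at the posterior mode and attach importance weights. The scalar "forward model" below is a stand-in for the expensive simulator, and the whole setup is illustrative rather than the paper's configuration.

    ```python
    # Sketch: Laplace-map importance sampling around the MAP point.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(9)
    y_obs, sigma = 1.2, 0.2

    def forward(theta):                   # stand-in for the simulator
        return np.tanh(theta) + 0.1 * theta

    def neg_log_post(theta):              # standard-normal prior + Gaussian noise
        return 0.5 * theta ** 2 + 0.5 * ((y_obs - forward(theta)) / sigma) ** 2

    opt = minimize(lambda t: neg_log_post(t[0]), x0=[0.0])
    map_pt = opt.x[0]
    h = 1e-4                              # finite-difference Hessian at the MAP
    hess = (neg_log_post(map_pt + h) - 2 * neg_log_post(map_pt)
            + neg_log_post(map_pt - h)) / h ** 2
    std = 1.0 / np.sqrt(hess)

    # draw from the Gaussian map and attach importance weights
    samples = map_pt + std * rng.standard_normal(2000)
    log_q = -0.5 * ((samples - map_pt) / std) ** 2 - np.log(std)
    log_w = -np.vectorize(neg_log_post)(samples) - log_q
    w = np.exp(log_w - log_w.max()); w /= w.sum()

    print("posterior mean %.3f, ESS %.0f"
          % (np.sum(w * samples), 1 / np.sum(w ** 2)))
    ```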

  19. Gaussian process surrogates for failure detection: A Bayesian experimental design approach

    NASA Astrophysics Data System (ADS)

    Wang, Hongqiao; Lin, Guang; Li, Jinglai

    2016-05-01

    An important task of uncertainty quantification is to identify the probability of undesired events, in particular, system failures, caused by various sources of uncertainties. In this work we consider the construction of Gaussian process surrogates for failure detection and failure probability estimation. In particular, we consider the situation that the underlying computer models are extremely expensive, and in this setting, determining the sampling points in the state space is of essential importance. We formulate the problem as an optimal experimental design for Bayesian inferences of the limit state (i.e., the failure boundary) and propose an efficient numerical scheme to solve the resulting optimization problem. In particular, the proposed limit-state inference method is capable of determining multiple sampling points at a time, and thus it is well suited for problems where multiple computer simulations can be performed in parallel. The accuracy and performance of the proposed method is demonstrated by both academic and practical examples.
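
    In the same spirit, the sketch below adaptively picks sampling points near a limit state with a GP surrogate, using the common U-criterion (smallest |mean|/std); the paper's own design criterion is more elaborate and can select batches of points in parallel, which this one-at-a-time toy does not attempt.

    ```python
    # Sketch: GP-based failure-boundary refinement on a toy limit state.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(10)
    g = lambda X: X[:, 0] ** 2 + X[:, 1] - 1.5       # failure when g(x) > 0

    X = rng.uniform(-2, 2, size=(8, 2))              # small initial design
    y = g(X)
    cand = rng.uniform(-2, 2, size=(3000, 2))        # Monte Carlo candidate pool

    for _ in range(20):                              # add one point per round
        gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-8).fit(X, y)
        mu, sd = gp.predict(cand, return_std=True)
        i = int(np.argmin(np.abs(mu) / np.maximum(sd, 1e-12)))
        X = np.vstack([X, cand[i]])                  # "run the expensive model"
        y = np.append(y, g(cand[i:i + 1]))

    gp = GaussianProcessRegressor(kernel=RBF(1.0), alpha=1e-8).fit(X, y)
    mu, _ = gp.predict(cand, return_std=True)
    print("estimated failure probability:", float(np.mean(mu > 0)))
    ```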

  20. Annealed Importance Sampling for Neural Mass Models

    PubMed Central

    Penny, Will; Sengupta, Biswa

    2016-01-01

    Neural Mass Models provide a compact description of the dynamical activity of cell populations in neocortical regions. Moreover, models of regional activity can be connected together into networks, and inferences made about the strength of connections, using M/EEG data and Bayesian inference. To date, however, Bayesian methods have been largely restricted to the Variational Laplace (VL) algorithm which assumes that the posterior distribution is Gaussian and finds model parameters that are only locally optimal. This paper explores the use of Annealed Importance Sampling (AIS) to address these restrictions. We implement AIS using proposals derived from Langevin Monte Carlo (LMC) which uses local gradient and curvature information for efficient exploration of parameter space. In terms of the estimation of Bayes factors, VL and AIS agree about which model is best but report different degrees of belief. Additionally, AIS finds better model parameters and we find evidence of non-Gaussianity in their posterior distribution. PMID:26942606
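
    A compact AIS sketch on a synthetic bimodal target, using plain random-walk Metropolis transitions in place of the Langevin proposals used in the paper; the annealing path and target are invented for illustration.

    ```python
    # Sketch: annealed importance sampling from a broad Gaussian "prior"
    # to an unnormalized bimodal target.
    import numpy as np

    rng = np.random.default_rng(11)
    log_prior = lambda x: -0.5 * x ** 2 / 9.0                 # N(0, 3^2), unnorm.
    log_target = lambda x: np.logaddexp(-0.5 * (x - 2) ** 2 / 0.1,
                                        -0.5 * (x + 2) ** 2 / 0.1)
    log_pi = lambda x, b: (1 - b) * log_prior(x) + b * log_target(x)

    betas = np.linspace(0.0, 1.0, 60)                          # annealing path
    n = 2000
    x = 3.0 * rng.standard_normal(n)                           # prior draws
    log_w = np.zeros(n)

    for b0, b1 in zip(betas[:-1], betas[1:]):
        log_w += log_pi(x, b1) - log_pi(x, b0)                 # weight update
        for _ in range(3):                                     # MCMC at level b1
            prop = x + 0.5 * rng.standard_normal(n)
            acc = np.log(rng.uniform(size=n)) < log_pi(prop, b1) - log_pi(x, b1)
            x = np.where(acc, prop, x)

    w = np.exp(log_w - log_w.max())
    print("normalized ESS: %.2f" % ((w.sum() ** 2 / (w ** 2).sum()) / n))
    ```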

  1. Sensitive and specific identification by polymerase chain reaction of Eimeria tenella and Eimeria maxima, important protozoan pathogens in laboratory avian facilities.

    PubMed

    Lee, Hyun-A; Hong, Sunhwa; Chung, Yungho; Kim, Okjin

    2011-09-01

    Eimeria tenella and Eimeria maxima are important pathogens causing intracellular protozoan infections in laboratory avian animals and are known to affect experimental results obtained from contaminated animals. This study aimed to find a fast, sensitive, and efficient protocol for the molecular identification of E. tenella and E. maxima in experimental samples using chickens as laboratory avian animals. DNA was extracted from fecal samples collected from chickens, and polymerase chain reaction (PCR) analysis was employed to detect E. tenella and E. maxima in the extracted DNA. The target nucleic acid fragments were specifically amplified by PCR. Feces containing E. tenella and E. maxima were detected by a positive PCR reaction. In this study, we were able to successfully detect E. tenella and E. maxima using the molecular diagnostic method of PCR. As such, we recommend PCR for monitoring E. tenella and E. maxima in laboratory avian facilities.

  2. Importance-sampling computation of statistical properties of coupled oscillators

    NASA Astrophysics Data System (ADS)

    Gupta, Shamik; Leitão, Jorge C.; Altmann, Eduardo G.

    2017-07-01

    We introduce and implement an importance-sampling Monte Carlo algorithm to study systems of globally coupled oscillators. Our computational method efficiently obtains estimates of the tails of the distribution of various measures of dynamical trajectories corresponding to states occurring with (exponentially) small probabilities. We demonstrate the general validity of our results by applying the method to two contrasting cases: the driven-dissipative Kuramoto model, a paradigm in the study of spontaneous synchronization; and the conservative Hamiltonian mean-field model, a prototypical system of long-range interactions. We present results for the distribution of the finite-time Lyapunov exponent and a time-averaged order parameter. Among other features, our results show most notably that the distributions exhibit a vanishing standard deviation but a skewness that is increasing in magnitude with the number of oscillators, implying that nontrivial asymmetries and states yielding rare or atypical values of the observables persist even for a large number of oscillators.
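
    The generic principle of importance sampling for distribution tails can be shown with exponential tilting; the paper biases whole dynamical trajectories, which the toy below does not attempt. It estimates a tiny tail probability for a sum of uniform variables by sampling from a tilted density and reweighting.

    ```python
    # Sketch: exponential-tilting importance sampling for P(S > a),
    # S = sum of n independent U(0,1) variables.
    import numpy as np
    from scipy.optimize import brentq

    rng = np.random.default_rng(13)
    n, a = 50, 0.8 * 50                       # far in the tail (mean is 25)

    # cumulant generating function of U(0,1) and its derivative
    K = lambda t: np.log((np.exp(t) - 1) / t)
    Kp = lambda t: 1 / (1 - np.exp(-t)) - 1 / t
    t_star = brentq(lambda t: Kp(t) - a / n, 1e-6, 50)   # tilt so mean = a/n

    # sample the tilted density f_t(x) ~ exp(t x) on (0,1) by inversion
    u = rng.random((100000, n))
    x = np.log(1 + u * (np.exp(t_star) - 1)) / t_star
    S = x.sum(axis=1)
    logw = n * K(t_star) - t_star * S          # likelihood ratio back to U(0,1)
    est = np.mean((S > a) * np.exp(logw))
    print("P(S > a) ~ %.3e" % est)             # far too rare for naive sampling
    ```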

  3. Accelerated weight histogram method for exploring free energy landscapes

    NASA Astrophysics Data System (ADS)

    Lindahl, V.; Lidmar, J.; Hess, B.

    2014-07-01

    Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.

  4. Accelerated weight histogram method for exploring free energy landscapes.

    PubMed

    Lindahl, V; Lidmar, J; Hess, B

    2014-07-28

    Calculating free energies is an important and notoriously difficult task for molecular simulations. The rapid increase in computational power has made it possible to probe increasingly complex systems, yet extracting accurate free energies from these simulations remains a major challenge. Fully exploring the free energy landscape of, say, a biological macromolecule typically requires sampling large conformational changes and slow transitions. Often, the only feasible way to study such a system is to simulate it using an enhanced sampling method. The accelerated weight histogram (AWH) method is a new, efficient extended ensemble sampling technique which adaptively biases the simulation to promote exploration of the free energy landscape. The AWH method uses a probability weight histogram which allows for efficient free energy updates and results in an easy discretization procedure. A major advantage of the method is its general formulation, making it a powerful platform for developing further extensions and analyzing its relation to already existing methods. Here, we demonstrate its efficiency and general applicability by calculating the potential of mean force along a reaction coordinate for both a single dimension and multiple dimensions. We make use of a non-uniform, free energy dependent target distribution in reaction coordinate space so that computational efforts are not wasted on physically irrelevant regions. We present numerical results for molecular dynamics simulations of lithium acetate in solution and chignolin, a 10-residue long peptide that folds into a β-hairpin. We further present practical guidelines for setting up and running an AWH simulation.
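
    A Wang-Landau-style caricature of the adaptive-bias idea on a discretized double-well: the running free-energy estimate is used as a bias so the walker eventually covers the coordinate uniformly. AWH's actual probability-weight-histogram update is more sophisticated than the constant increment used here; this is only a sketch of the family.

    ```python
    # Sketch: adaptive biasing along a 1D reaction coordinate.
    import numpy as np

    rng = np.random.default_rng(12)
    xs = np.linspace(-1.5, 1.5, 61)
    F_true = 5.0 * (xs ** 2 - 1.0) ** 2     # free energy along the coordinate (kT)
    F_est = np.zeros_like(xs)               # running estimate, also the bias
    i = len(xs) // 2                        # start at the central barrier

    for step in range(400000):
        j = int(np.clip(i + rng.integers(-1, 2), 0, len(xs) - 1))
        dE = (F_true[j] - F_est[j]) - (F_true[i] - F_est[i])   # biased potential
        if np.log(rng.uniform()) < -dE:
            i = j
        F_est[i] += 0.002                   # raise the bias where the walker sits

    F_est -= F_est[np.argmin(np.abs(xs - 1.0))]    # anchor at the right well
    print("estimated barrier height: %.2f kT (true 5.00)" % F_est[len(xs) // 2])
    ```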

  5. Graphene deposited onto aligned zinc oxide nanorods as an efficient coating for headspace solid-phase microextraction of gasoline fractions from oil samples.

    PubMed

    Wen, Congying; Li, Mengmeng; Li, Wangbo; Li, Zizhou; Duan, Wei; Li, Yulong; Zhou, Jie; Li, Xiyou; Zeng, Jingbin

    2017-12-29

    The content of gasoline fractions in oil samples is not only an important indicator of oil quality, but also indispensable fundamental data for oil refining and processing. Before its determination, efficient preconcentration and separation of gasoline fractions from complicated matrices is essential. In this work, a thin layer of graphene (G) was deposited onto oriented ZnO nanorods (ZNRs) as an SPME coating. By this approach, the surface area of G was greatly enhanced by the aligned ZNRs, and the surface polarity of the ZNRs was changed from polar to less polar, both of which were beneficial for the extraction of gasoline fractions. In addition, the ZNRs were well protected by the mechanically and chemically stable G, making the coating highly durable in use. In headspace SPME (HS-SPME) mode, the G/ZNRs coating can effectively extract gasoline fractions from various oil samples, with extraction efficiencies 1.5-5.4 and 2.1-8.2 times higher than those of a bare G coating and a commercial 7-μm PDMS coating, respectively. Coupled with GC-FID, the developed method is sensitive, simple, cost-effective and easily accessible for the analysis of gasoline fractions. Moreover, the method is also feasible for the detection of gasoline markers in simulated oil-polluted water, which provides an option for the monitoring of oil spill accidents. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Seasonal comparison of moss bag technique against vertical snow samples for monitoring atmospheric pollution.

    PubMed

    Salo, Hanna; Berisha, Anna-Kaisa; Mäkinen, Joni

    2016-03-01

    This is the first study to seasonally apply Sphagnum papillosum moss bags and vertical snow samples for monitoring atmospheric pollution. Moss bags, exposed in January, were collected together with snow samples by early March 2012 near the Harjavalta Industrial Park in southwest Finland. Magnetic, chemical, scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX), K-means clustering, and Tomlinson pollution load index (PLI) data showed parallel spatial trends of pollution dispersal for both materials. The results strengthen previous findings that concentrate and slag handling activities were important (dust) emission sources, while the impact from the Cu-Ni smelter's pipe remained secondary at closer distances. Statistically significant correlations existed between the variables of snow and moss bags. In summary, both methods work well for sampling and are efficient pollutant accumulators. Moss bags can also be used in winter conditions, and they provide a more homogeneous and better-controlled sampling method than snow samples. Copyright © 2015. Published by Elsevier B.V.

  7. Geostatistical modeling of riparian forest microclimate and its implications for sampling

    USGS Publications Warehouse

    Eskelson, B.N.I.; Anderson, P.D.; Hagar, J.C.; Temesgen, H.

    2011-01-01

    Predictive models of microclimate under various site conditions in forested headwater stream-riparian areas are poorly developed, and sampling designs for characterizing underlying riparian microclimate gradients are sparse. We used riparian microclimate data collected at eight headwater streams in the Oregon Coast Range to compare ordinary kriging (OK), universal kriging (UK), and kriging with external drift (KED) for point prediction of mean maximum air temperature (Tair). Several topographic and forest structure characteristics were considered as site-specific parameters. Height above stream and distance to stream were the most important covariates in the KED models, which outperformed OK and UK in terms of root mean square error. Sample patterns were optimized based on the kriging variance and the weighted means of shortest distance criterion using the simulated annealing algorithm. The optimized sample patterns outperformed systematic sample patterns in terms of mean kriging variance mainly for small sample sizes. These findings suggest methods for increasing efficiency of microclimate monitoring in riparian areas.
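    The sample-pattern optimization step lends itself to a compact illustration. Below is a minimal simulated annealing sketch that places monitoring locations against a mean shortest-distance coverage criterion; this objective is a simplified stand-in for the study's combination of kriging variance and the weighted means of shortest distance criterion, and all parameters are invented.

        import numpy as np

        rng = np.random.default_rng(5)
        candidates = rng.random((400, 2))   # candidate monitoring locations
        n_pick = 12                         # stations to place

        def mean_shortest_distance(sample):
            # Average distance from every candidate point to its nearest
            # sampled location; smaller means better spatial coverage.
            d = np.linalg.norm(candidates[:, None, :] - sample[None, :, :], axis=2)
            return d.min(axis=1).mean()

        current = candidates[rng.choice(len(candidates), n_pick, replace=False)]
        best, best_obj = current.copy(), mean_shortest_distance(current)
        T = 0.1                             # initial "temperature"
        for step in range(5_000):
            trial = current.copy()
            # Swap one station for a random candidate (may occasionally pick
            # a duplicate location; harmless for this sketch).
            trial[rng.integers(n_pick)] = candidates[rng.integers(len(candidates))]
            delta = mean_shortest_distance(trial) - mean_shortest_distance(current)
            if delta < 0 or rng.random() < np.exp(-delta / T):
                current = trial             # accept downhill, sometimes uphill
                obj = mean_shortest_distance(current)
                if obj < best_obj:
                    best, best_obj = current.copy(), obj
            T *= 0.999                      # geometric cooling schedule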

  8. Relative efficiency of anuran sampling methods in a restinga habitat (Jurubatiba, Rio de Janeiro, Brazil).

    PubMed

    Rocha, C F D; Van Sluys, M; Hatano, F H; Boquimpani-Freitas, L; Marra, R V; Marques, R V

    2004-11-01

    Studies on anurans in restinga habitats are few and, as a result, there is little information on which methods are more efficient for sampling them in this environment. Ten methods are usually used for sampling anuran communities in tropical and sub-tropical areas. In this study we evaluate which methods are more appropriate for this purpose in the restinga environment of Parque Nacional da Restinga de Jurubatiba. We analyzed six methods among those usually used for anuran sampling. For each method, we recorded the total amount of time spent (in min.), the number of researchers involved, and the number of species captured. We calculated a capture efficiency index (the time necessary for a researcher to capture an individual frog) in order to make the data obtained comparable. Of the methods analyzed, the species inventory (9.7 min/searcher/ind. - MSI; richness = 6; abundance = 23) and the breeding site survey (9.5 MSI; richness = 4; abundance = 22) were the most efficient. The visual encounter inventory (45.0 MSI) and patch sampling (65.0 MSI) methods were of comparatively lower efficiency in the restinga, whereas the plot sampling and pit-fall traps with drift-fence methods resulted in no frog captures. We conclude that there is a considerable difference in the efficiency of methods used in the restinga environment and that the complete species inventory method is highly efficient for sampling frogs in the restinga studied and may be so in other restinga environments. Methods that are usually efficient in forested areas seem to be of little value in open restinga habitats.
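    For concreteness, the capture efficiency index can be computed as in the sketch below, which assumes MSI is person-minutes of search effort per captured frog (one plausible reading of "min/searcher/ind."); the function name and figures are invented for illustration.

        def capture_efficiency_index(total_minutes, n_searchers, n_captured):
            # Person-minutes of effort needed to capture one frog; lower is
            # more efficient. The unit interpretation is an assumption.
            return total_minutes * n_searchers / n_captured

        # e.g., 2 searchers, 60 minutes, 12 frogs -> 10.0 min/searcher/ind.
        print(capture_efficiency_index(60, 2, 12))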

  9. Improving Efficiency with Work Sampling.

    ERIC Educational Resources Information Center

    Friedman, Mark; Hertz, Paul

    1982-01-01

    Work sampling is a managerial accounting technique which provides information about the efficiency of an operation. This analysis determines what tasks are being performed during a period of time to ascertain if time and effort are being allocated efficiently. (SK)

  10. Molecular comparison of the sampling efficiency of four types of airborne bacterial samplers.

    PubMed

    Li, Kejun

    2011-11-15

    In the present study, indoor and outdoor air samples were collected using four types of air samplers often used for airborne bacterial sampling. These air samplers included two solid impactors (BioStage and RCS), one liquid impinger (BioSampler), and one filter sampler with two kinds of filters (a gelatin and a cellulose acetate filter). The collected air samples were further processed to analyze the diversity and abundance of culturable bacteria and total bacteria through standard culture techniques, denaturing gradient gel electrophoresis (DGGE) fingerprinting and quantitative polymerase chain reaction (qPCR) analysis. The DGGE analysis indicated that the air samples collected using the BioStage and RCS samplers had higher culturable bacterial diversity, whereas the samples collected using the BioSampler and the cellulose acetate filter sampler had higher total bacterial diversity. To obtain more information on the sampled bacteria, some gel bands were excised and sequenced. In terms of sampling efficiency, results from the qPCR tests indicated that the collected total bacterial concentration was higher in samples collected using the BioSampler and the cellulose acetate filter sampler. In conclusion, the sampling bias and efficiency of four kinds of air sampling systems were compared in the present study; the two solid impactors were found to be comparatively efficient for culturable bacterial sampling, whereas the liquid impinger and the cellulose acetate filter sampler were efficient for total bacterial sampling. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Small Sample Properties of Asymptotically Efficient Estimators of the Parameters of a Bivariate Gaussian–Weibull Distribution

    Treesearch

    Steve P. Verrill; James W. Evans; David E. Kretschmann; Cherilyn A. Hatfield

    2012-01-01

    Two important wood properties are stiffness (modulus of elasticity or MOE) and bending strength (modulus of rupture or MOR). In the past, MOE has often been modeled as a Gaussian and MOR as a lognormal or a two- or three-parameter Weibull. It is well known that MOE and MOR are positively correlated. To model the simultaneous behavior of MOE and MOR for the purposes of...

  12. Some practical universal noiseless coding techniques, part 2

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1983-01-01

    This report is an extension of earlier work (Part 1) which provided practical adaptive techniques for the efficient noiseless coding of a broad class of data sources characterized by only partially known and varying statistics (JPL Publication 79-22). The results here, while still claiming such general applicability, focus primarily on the noiseless coding of image data. A fairly complete and self-contained treatment is provided. Particular emphasis is given to the requirements of the forthcoming Voyager II encounters of Uranus and Neptune. Performance evaluations are supported both graphically and pictorially. Expanded definitions of the algorithms in Part 1 yield a computationally improved set of options for applications requiring efficient performance at entropies above 4 bits/sample. These expanded definitions include, as an important subset, a somewhat less efficient but extremely simple "FAST" compressor which will be used at the Voyager Uranus encounter. Additionally, options are provided which enhance performance when atypical data spikes may be present.
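    The Golomb-Rice family of codes underlying such options is simple enough to sketch. The functions below are a textbook construction for illustration only, not the specific algorithm options defined in Parts 1 and 2.

        def rice_encode(n: int, k: int) -> str:
            # Rice code of a non-negative integer n with parameter k:
            # quotient n >> k in unary (ones ended by a zero), then the
            # remainder in k plain binary bits.
            q, r = n >> k, n & ((1 << k) - 1)
            return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

        def rice_decode(bits: str, k: int) -> int:
            q = bits.index("0")                  # unary part ends at first 0
            r = int(bits[q + 1:q + 1 + k], 2) if k else 0
            return (q << k) | r

        # Small residuals compress well when k is matched to the data's
        # entropy, which is what adaptive schemes choose among.
        assert rice_encode(9, 2) == "11001"
        assert rice_decode(rice_encode(9, 2), 2) == 9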

  13. Separation of cancer cells from white blood cells by pinched flow fractionation.

    PubMed

    Pødenphant, Marie; Ashley, Neil; Koprowska, Kamila; Mir, Kalim U; Zalkovskij, Maksim; Bilenberg, Brian; Bodmer, Walter; Kristensen, Anders; Marie, Rodolphe

    2015-12-21

    In this paper, the microfluidic size-separation technique pinched flow fractionation (PFF) is used to separate cancer cells from white blood cells (WBCs). The cells are separated at efficiencies above 90% for both cell types. Circulating tumor cells (CTCs) are found in the blood of cancer patients and can form new tumors. CTCs are rare cells in blood, but they are important for the understanding of metastasis. There is therefore a high interest in developing a method for the enrichment of CTCs from blood samples, which also enables further analysis of the separated cells. The separation is challenged by the size overlap between cancer cells and the 10⁶ times more abundant WBCs. The size overlap prevents high-efficiency separation; however, we demonstrate that cell deformability can be exploited in PFF devices to gain higher efficiencies than expected from the size distribution of the cells.

  14. Optimal dielectric and cavity configurations for improving the efficiency of electron paramagnetic resonance probes

    NASA Astrophysics Data System (ADS)

    Elnaggar, Sameh Y.; Tervo, Richard; Mattar, Saba M.

    2014-08-01

    An electron paramagnetic resonance (EPR) spectrometer's lambda efficiency parameter (Λ) is one of the most important parameters that govern its sensitivity. It is studied for an EPR probe consisting of a dielectric resonator (DR) in a cavity (CV). Expressions for Λ are derived in terms of the probe's individual DR and CV components, Λ1 and Λ2, respectively. Two important cases are considered. In the first, a probe consisting of a CV is improved by incorporating a DR. The sensitivity enhancement depends on the relative rather than the absolute values of the individual components. This renders the analysis general. The optimal configuration occurs when the CV and DR modes are nearly degenerate. This configuration guarantees that the probe can be easily coupled to the microwave bridge while maintaining a large Λ. It is shown that for a lossy CV with a small quality factor Q2, one chooses a DR that has the highest filling factor, η1, regardless of its Λ1 and Q1. On the other hand, if the CV has a large Q2, the optimum DR is the one which has the highest Λ1, regardless of its η1 and relative dielectric constant, ɛr. When the quality factors of both the CV and DR are comparable, the lambda efficiency is reduced by a factor of √2. Thus the signal intensity for an unsaturated sample is cut in half. The second case is the design of an optimum shield to house a DR. Besides preventing radiation leakage, it is shown that for a high-loss DR, the shield can actually boost Λ above the DR value. This can also be very helpful for relatively low efficiency dielectrics as well as lossy samples, such as polar liquids.

  15. The importance of accounting for larval detectability in mosquito habitat-association studies.

    PubMed

    Low, Matthew; Tsegaye, Admasu Tassew; Ignell, Rickard; Hill, Sharon; Elleby, Rasmus; Feltelius, Vilhelm; Hopkins, Richard

    2016-05-04

    Mosquito habitat-association studies are an important basis for disease control programmes and/or vector distribution models. However, studies do not explicitly account for incomplete detection during larval presence and abundance surveys, with potential for significant biases because of environmental influences on larval behaviour and sampling efficiency. Data were used from a dip-sampling study for Anopheles larvae in Ethiopia to evaluate the effect of six factors previously associated with larval sampling (riparian vegetation, direct sunshine, algae, water depth, pH and temperature) on larval presence and detectability. Comparisons were made between: (i) a presence-absence logistic regression where samples were pooled at the site level and detectability ignored, (ii) a success versus trials binomial model, and (iii) a presence-detection mixture model that separately estimated presence and detection, and fitted different explanatory variables to these estimations. Riparian vegetation was consistently highlighted as important, strongly suggesting it explains larval presence (-; signs in parentheses indicate the direction of association). However, depending on how larval detectability was estimated, the other factors showed large variations in their statistical importance. The presence-detection mixture model provided strong evidence that larval detectability was influenced by sunshine and water temperature (+), with weaker evidence for algae (+) and water depth (-). For larval presence, there was also some evidence that water depth (-) and pH (+) influenced site occupation. The number of dip-samples needed to determine if larvae were likely present at a site was condition-dependent: sunshine and warm water required only two dips, while cooler water and cloud cover required 11. Environmental factors influence true larval presence and larval detectability differentially when sampling in field conditions. Researchers need to be more aware of the limitations and possible biases in different analytical approaches used to associate larval presence or abundance with local environmental conditions. These effects can be disentangled using data that are routinely collected (i.e., multiple dip samples at each site) by employing a modelling approach that separates presence from detectability.
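    The presence-detection mixture model is, in essence, a site-occupancy model. The sketch below is a minimal intercept-only version of such a likelihood for repeated dip samples; the actual analysis placed environmental covariates on both components, and all names and simulated values here are illustrative.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        # y[i, j] = 1 if larvae were detected on dip j at site i, else 0.
        # A site is occupied with probability psi; each dip at an occupied
        # site detects larvae with probability p (zero-inflated binomial).
        def neg_log_lik(theta, y):
            psi, p = expit(theta)                # map parameters to (0, 1)
            det = y.sum(axis=1)                  # detections per site
            J = y.shape[1]                       # dips per site
            lik = psi * p**det * (1 - p)**(J - det)
            lik = lik + (det == 0) * (1 - psi)   # all-zero sites may be empty
            return -np.log(lik).sum()

        rng = np.random.default_rng(1)
        z = rng.random(200) < 0.6                        # true occupancy
        y = (rng.random((200, 5)) < 0.4) & z[:, None]    # detect only if occupied
        fit = minimize(neg_log_lik, x0=np.zeros(2), args=(y,))
        print("psi, p =", expit(fit.x))                  # near (0.6, 0.4)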

  16. The Conformational Flexibility of the Acyltransferase from the Disorazole Polyketide Synthase Is Revealed by an X-ray Free-Electron Laser Using a Room-Temperature Sample Delivery Method for Serial Crystallography

    PubMed Central

    Allison, Kim; Robbins, Thomas; Lyubimov, Artem Y.; Uervirojnangkoorn, Monarin; Brunger, Axel T.; Khosla, Chaitan; DeMirci, Hasan; McPhillips, Scott E.; Hollenbeck, Michael; Soltis, Michael; Cohen, Aina E.

    2017-01-01

    The crystal structure of the trans-acyltransferase (AT) from the disorazole polyketide synthase (PKS) was determined at room temperature to a resolution of 2.5 Å using a new method for the direct delivery of the sample into an X-ray free-electron laser. A novel sample extractor efficiently delivered limited quantities of microcrystals directly from the native crystallization solution into the X-ray beam at room temperature. The AT structure revealed important catalytic features of this core PKS enzyme, including the occurrence of conformational changes around the active site. The implications of these conformational changes for polyketide synthase reaction dynamics are discussed. PMID:28832129

  17. An efficient method and device for transfer of semisolid materials into solid-state NMR spectroscopy rotors.

    PubMed

    Hisao, Grant S; Harland, Michael A; Brown, Robert A; Berthold, Deborah A; Wilson, Thomas E; Rienstra, Chad M

    2016-04-01

    The study of mass-limited biological samples by magic angle spinning (MAS) solid-state NMR spectroscopy critically relies upon the high-yield transfer of material from a biological preparation into the MAS rotor. This issue is particularly important for maintaining biological activity and hydration of semi-solid samples such as membrane proteins in lipid bilayers, pharmaceutical formulations, microcrystalline proteins and protein fibrils. Here we present protocols and designs for rotor-packing devices specifically suited for packing hydrated samples into Pencil-style 1.6 mm, 3.2 mm standard, and 3.2 mm limited speed MAS rotors. The devices are modular and therefore readily adaptable to other rotor and/or ultracentrifugation tube geometries. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Multiscale simulations of patchy particle systems combining Molecular Dynamics, Path Sampling and Green's Function Reaction Dynamics

    NASA Astrophysics Data System (ADS)

    Bolhuis, Peter

    Important reaction-diffusion processes, such as biochemical networks in living cells, or self-assembling soft matter, span many orders of magnitude in length and time scales. In these systems, the reactants' spatial dynamics at mesoscopic length and time scales of microns and seconds is coupled to the reactions between the molecules at microscopic length and time scales of nanometers and milliseconds. This wide range of length and time scales makes these systems notoriously difficult to simulate. While mean-field rate equations cannot describe such processes, the mesoscopic Green's Function Reaction Dynamics (GFRD) method enables efficient simulation at the particle level provided the microscopic dynamics can be integrated out. Yet, many processes exhibit non-trivial microscopic dynamics that can qualitatively change the macroscopic behavior, calling for an atomistic, microscopic description. The recently developed multiscale Molecular Dynamics Green's Function Reaction Dynamics (MD-GFRD) approach combines GFRD for simulating the system at the mesoscopic scale where particles are far apart, with microscopic Molecular (or Brownian) Dynamics, for simulating the system at the microscopic scale where reactants are in close proximity. The association and dissociation of particles are treated with rare event path sampling techniques. I will illustrate the efficiency of this method for patchy particle systems. Replacing the microscopic simulation with a Markov state model (MSM) avoids explicit treatment of the microscopic regime completely; the MSM is pre-computed using advanced path-sampling techniques such as multistate transition interface sampling. I illustrate this approach on patchy particle systems that show multiple modes of binding. MD-GFRD is generic, and can be used to efficiently simulate reaction-diffusion systems at the particle level, including the orientational dynamics, opening up the possibility for large-scale simulations of e.g. protein signaling networks.

  19. Application of Nanofiber-packed SPE for Determination of Urinary 1-Hydroxypyrene Level Using HPLC.

    PubMed

    Ifegwu, Okechukwu Clinton; Anyakora, Chimezie; Chigome, Samuel; Torto, Nelson

    2014-01-01

    It is always desirable to achieve maximum sample clean-up, extraction, and pre-concentration with the minimum possible organic solvent. The miniaturization of sample preparation devices was successfully demonstrated by packing 10 mg of 11 electrospun polymer nanofibers into pipette-tip micro-column and mini-disc cartridges for efficient pre-concentration of 1-hydroxypyrene in urine samples. 1-hydroxypyrene is an extensively studied biomarker of the largest class of chemical carcinogens. Excretory 1-hydroxypyrene was monitored with an HPLC/fluorescence detector. Important parameters influencing the percentage recovery, such as fiber diameter, fiber packing amount, eluent, fiber packing format, eluent volume, surface area, porosity, and breakthrough parameters, were thoroughly studied and optimized. Under optimized conditions, there was near-perfect linearity of response in the range of 1-1000 μg/L, with a coefficient of determination (r²) between 0.9992 and 0.9999 and precision (% RSD) ≤7.64% (n = 6) for all analyses (10, 25, and 50 μg/L). The limit of detection (LOD) was between 0.022 and 0.15 μg/L. When compared to the batch studies, both the disc-packed nanofiber sorbents and the pipette-tip-packed sorbents showed clearly superior efficiencies. The experimental results showed comparable absolute recoveries for the mini-disc-packed fibers (84% for Nylon 6) and micro columns (80% for Nylon 6), although the disc displayed slightly higher recoveries, possibly due to the exposure of the analyte to a larger reacting surface. The results also showed extraction efficiencies highly comparable between the nanofibers and a conventional C-18 SPE sorbent. Nevertheless, the miniaturized SPE devices simplified sample preparation and reduced back pressure and analysis time while offering acceptable reliability, selectivity, detection levels, and environmental friendliness, hence promoting green chemistry.

  20. Determinants of efficiency in the provision of municipal street-cleaning and refuse collection services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benito-Lopez, Bernardino, E-mail: benitobl@um.es; Rocio Moreno-Enguix, Maria del, E-mail: mrmoreno@um.es; Solana-Ibanez, Jose, E-mail: jsolana@um.es

    Effective waste management systems can make critical contributions to public health, environmental sustainability and economic development. The challenge affects every person and institution in society, and measures cannot be undertaken without data collection and a quantitative analysis approach. In this paper, the two-stage double bootstrap procedure of Simar and Wilson (2007) is used to estimate the efficiency determinants of Spanish local entities in the provision of public street-cleaning and refuse collection services. The purpose is to identify factors that influence efficiency. The final sample comprised 1072 municipalities. In the first stage, robust efficiency estimates are obtained with Data Envelopment Analysis (DEA). We apply the second stage, based on a truncated regression, to estimate the effect of a group of environmental factors on DEA estimates. The results show the existence of a significant relation between efficiency and all the variables analysed (per capita income, urban population density, the comparative index of the importance of tourism and that of the whole economic activity). We have also considered the influence of a dummy categorical variable - the political sign of the governing party - on the efficient provision of the services under study. The results from the methodology proposed show that municipalities governed by progressive parties are more efficient.

  1. Determinants of efficiency in the provision of municipal street-cleaning and refuse collection services.

    PubMed

    Benito-López, Bernardino; Moreno-Enguix, María del Rocio; Solana-Ibañez, José

    2011-06-01

    Effective waste management systems can make critical contributions to public health, environmental sustainability and economic development. The challenge affects every person and institution in society, and measures cannot be undertaken without data collection and a quantitative analysis approach. In this paper, the two-stage double bootstrap procedure of Simar and Wilson (2007) is used to estimate the efficiency determinants of Spanish local entities in the provision of public street-cleaning and refuse collection services. The purpose is to identify factors that influence efficiency. The final sample comprised 1072 municipalities. In the first stage, robust efficiency estimates are obtained with Data Envelopment Analysis (DEA). We apply the second stage, based on a truncated-regression, to estimate the effect of a group of environmental factors on DEA estimates. The results show the existence of a significant relation between efficiency and all the variables analysed (per capita income, urban population density, the comparative index of the importance of tourism and that of the whole economic activity). We have also considered the influence of a dummy categorical variable - the political sign of the governing party - on the efficient provision of the services under study. The results from the methodology proposed show that municipalities governed by progressive parties are more efficient. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Experimental scattershot boson sampling

    PubMed Central

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J.; Galvão, Ernesto F.; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-01-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy. PMID:26601164

  3. Experimental scattershot boson sampling.

    PubMed

    Bentivegna, Marco; Spagnolo, Nicolò; Vitelli, Chiara; Flamini, Fulvio; Viggianiello, Niko; Latmiral, Ludovico; Mataloni, Paolo; Brod, Daniel J; Galvão, Ernesto F; Crespi, Andrea; Ramponi, Roberta; Osellame, Roberto; Sciarrino, Fabio

    2015-04-01

    Boson sampling is a computational task strongly believed to be hard for classical computers, but efficiently solvable by orchestrated bosonic interference in a specialized quantum computer. Current experimental schemes, however, are still insufficient for a convincing demonstration of the advantage of quantum over classical computation. A new variation of this task, scattershot boson sampling, leads to an exponential increase in speed of the quantum device, using a larger number of photon sources based on parametric down-conversion. This is achieved by having multiple heralded single photons being sent, shot by shot, into different random input ports of the interferometer. We report the first scattershot boson sampling experiments, where six different photon-pair sources are coupled to integrated photonic circuits. We use recently proposed statistical tools to analyze our experimental data, providing strong evidence that our photonic quantum simulator works as expected. This approach represents an important leap toward a convincing experimental demonstration of the quantum computational supremacy.

  4. High Resolution Separations and Improved Ion Production and Transmission in Metabolomics

    PubMed Central

    Metz, Thomas O.; Page, Jason S.; Baker, Erin S.; Tang, Keqi; Ding, Jie; Shen, Yufeng; Smith, Richard D.

    2008-01-01

    The goal of metabolomics analyses is the detection and quantitation of as many sample components as reasonably possible in order to identify compounds or “features” that can be used to characterize the samples under study. When utilizing electrospray ionization to produce ions for analysis by mass spectrometry (MS), it is important that metabolome sample constituents be efficiently separated prior to ion production, in order to minimize ionization suppression and thereby extend the dynamic range of the measurement, as well as the coverage of the metabolome. Similarly, optimization of the MS inlet and interface can lead to increased measurement sensitivity. This perspective review will focus on the role of high resolution liquid chromatography (LC) separations in conjunction with improved ion production and transmission for LC-MS-based metabolomics. Additional emphasis will be placed on the compromise between metabolome coverage and sample analysis throughput. PMID:19255623

  5. Collection, transport and general processing of clinical specimens in Microbiology laboratory.

    PubMed

    Sánchez-Romero, M Isabel; García-Lechuz Moya, Juan Manuel; González López, Juan José; Orta Mira, Nieves

    2018-02-06

    The interpretation and accuracy of microbiological results still depend to a great extent on the quality of the samples and their processing within the Microbiology laboratory. The type of specimen, the appropriate time to obtain the sample, the sampling procedure, and the storage and transport conditions are critical points in the diagnostic process. The availability of new laboratory techniques for unusual pathogens makes it necessary to review and update all the steps involved in the processing of the samples. Nowadays, laboratory automation and the availability of rapid techniques provide the precision and turnaround time necessary to help clinicians in their decision making. To be efficient, it is very important to obtain clinical information so that the best diagnostic tools can be used. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  6. Evaluation of the availability of bound analyte for passive sampling in the presence of mobile binding matrix.

    PubMed

    Xu, Jianqiao; Huang, Shuyao; Jiang, Ruifen; Cui, Shufen; Luan, Tiangang; Chen, Guosheng; Qiu, Junlang; Cao, Chenyang; Zhu, Fang; Ouyang, Gangfeng

    2016-04-21

    Elucidating the availability of bound analytes for mass transfer through the diffusion boundary layers (DBLs) adjacent to passive samplers is important for understanding passive sampling kinetics in complex samples, in which the lability factor of the bound analyte in the DBL is an important parameter. In this study, the mathematical expression of the lability factor was deduced by assuming a pseudo-steady state during passive sampling, and the equation indicated that the lability factor is equal to the ratio of normalized concentration gradients between the bound and free analytes. Through the introduction of the mathematical expression of the lability factor, the modified effective average diffusion coefficient was proven to be more suitable for describing passive sampling kinetics in the presence of mobile binding matrixes. Thereafter, the lability factors of polycyclic aromatic hydrocarbons (PAHs) bound to sodium dodecylsulphate (SDS) micelles as the binding matrixes were determined according to the improved theory. The lability factors were observed to decrease with larger binding ratios and smaller micelle sizes, and were successfully used to predict the mass transfer efficiencies of PAHs through DBLs. This study should promote the understanding of the availability of bound analytes for passive sampling, based on the theoretical improvements and experimental assessments presented. Copyright © 2016 Elsevier B.V. All rights reserved.
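    In illustrative notation (the symbols below are ours, not necessarily the authors'), the statement that the lability factor equals the ratio of normalized concentration gradients, and its use in an effective average diffusion coefficient, might be written as

        \[
        \xi \;=\; \frac{(\partial C_b/\partial x)\,/\,C_b}{(\partial C_f/\partial x)\,/\,C_f},
        \qquad
        D_{\mathrm{eff}} \;=\; \frac{D_f\, C_f + \xi\, D_b\, C_b}{C_f + C_b},
        \]

    where C_f and C_b are the free and bound analyte concentrations in the DBL and D_f and D_b their diffusion coefficients. The second expression is a common form of lability-corrected effective diffusivity and may differ in detail from the modified coefficient derived in the paper.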

  7. Relative efficiency and sample size for cluster randomized trials with variable cluster sizes.

    PubMed

    You, Zhiying; Williams, O Dale; Aban, Inmaculada; Kabagambe, Edmond Kato; Tiwari, Hemant K; Cutter, Gary

    2011-02-01

    The statistical power of cluster randomized trials depends on two sample size components, the number of clusters per group and the numbers of individuals within clusters (cluster size). Variable cluster sizes are common and this variation alone may have a significant impact on study power. Previous approaches have taken this into account by either adjusting total sample size using a designated design effect or adjusting the number of clusters according to an assessment of the relative efficiency of unequal versus equal cluster sizes. This article defines a relative efficiency of unequal versus equal cluster sizes using noncentrality parameters, investigates properties of this measure, and proposes an approach for adjusting the required sample size accordingly. We focus on comparing two groups with normally distributed outcomes using a t-test, and use the noncentrality parameter to define the relative efficiency of unequal versus equal cluster sizes and show that statistical power depends only on this parameter for a given number of clusters. We calculate the sample size required for an unequal cluster sizes trial to have the same power as one with equal cluster sizes. Relative efficiency based on the noncentrality parameter is straightforward to calculate and easy to interpret. It connects the required mean cluster size directly to the required sample size with equal cluster sizes. Consequently, our approach first determines the sample size requirements with equal cluster sizes for a pre-specified study power and then calculates the required mean cluster size while keeping the number of clusters unchanged. Our approach allows adjustment in mean cluster size alone or simultaneous adjustment in mean cluster size and number of clusters, and is a flexible alternative to and a useful complement to existing methods. Comparison indicated that the relative efficiency defined here is greater than the relative efficiency in the literature under some conditions; under other conditions our measure might be less than the literature measure, underestimating the relative efficiency. The relative efficiency of unequal versus equal cluster sizes defined using the noncentrality parameter suggests a sample size approach that is a flexible alternative and a useful complement to existing methods.
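    The practical workflow the abstract describes (inflating a sample size to cover unequal cluster sizes) can be sketched with a widely used approximation. The design effect below is the standard variable-cluster-size form DE = 1 + ((CV² + 1)·m̄ − 1)·ICC, chosen for illustration; it is not the noncentrality-parameter measure defined in this paper, and the numbers are invented.

        import numpy as np

        def design_effect(mean_m, cv_m, icc):
            # DE = 1 + ((CV^2 + 1) * m_bar - 1) * ICC: a standard
            # approximation for cluster RCTs with unequal cluster sizes.
            return 1.0 + ((cv_m**2 + 1.0) * mean_m - 1.0) * icc

        n_individual = 128   # per-arm n for an individually randomized trial
        mean_m, cv_m, icc = 20, 0.6, 0.05        # assumed design parameters
        de = design_effect(mean_m, cv_m, icc)
        clusters_per_arm = int(np.ceil(n_individual * de / mean_m))
        print(f"design effect {de:.2f}, clusters per arm {clusters_per_arm}")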

  8. Direct determination of quantum efficiency of semiconducting films

    DOEpatents

    Faughnan, Brian W.; Hanak, Joseph J.

    1986-01-01

    Photovoltaic quantum efficiency of semiconductor samples is determined directly, without requiring that a built-in photovoltage be generated by the sample. Electrodes are attached to the sample so as to form at least one Schottky barrier therewith. When illuminated, the generated photocurrent carriers are collected by an external bias voltage impressed across the electrodes. The generated photocurrent is measured, and photovoltaic quantum efficiency is calculated therefrom.

  9. Direct determination of quantum efficiency of semiconducting films

    DOEpatents

    Faughnan, B.W.; Hanak, J.J.

    Photovoltaic quantum efficiency of semiconductor samples is determined directly, without requiring that a built-in photovoltage be generated by the sample. Electrodes are attached to the sample so as to form at least one Schottky barrier therewith. When illuminated, the generated photocurrent carriers are collected by an external bias voltage impressed across the electrodes. The generated photocurrent is measured, and photovoltaic quantum efficiency is calculated therefrom.

  10. A Formal Messaging Notation for Alaskan Aviation Data

    NASA Technical Reports Server (NTRS)

    Rios, Joseph L.

    2015-01-01

    Data exchange is an increasingly important aspect of the National Airspace System. While many data communication channels have become capable of sending and receiving data at higher throughput rates, there is still a need to use communication channels with limited throughput efficiently. The limitation can be based on technological issues, financial considerations, or both. This paper provides a complete description of several important aviation weather data types in Abstract Syntax Notation format. By doing so, data providers can take advantage of Abstract Syntax Notation's ability to encode data in a highly compressed format. When data such as pilot weather reports, surface weather observations, and various weather predictions are compressed in this manner, throughput-limited communication channels can be used efficiently. This paper provides details on the Abstract Syntax Notation One (ASN.1) implementation for Alaskan aviation data and demonstrates its use on real-world aviation weather data samples, as Alaska has sparse terrestrial data infrastructure and data are often sent via relatively costly satellite channels.
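    The kind of compactness ASN.1 buys is easy to demonstrate with the third-party asn1tools Python package and unaligned PER. The miniature weather-report type below is invented for illustration and is not the message notation published in the paper.

        import asn1tools  # pip install asn1tools

        # Hypothetical miniature report type, NOT the paper's actual schema.
        SPEC = """
        Aviation DEFINITIONS AUTOMATIC TAGS ::= BEGIN
            PilotReport ::= SEQUENCE {
                stationId   IA5String (SIZE (4)),
                tempCelsius INTEGER (-99..99),
                windKnots   INTEGER (0..250)
            }
        END
        """

        db = asn1tools.compile_string(SPEC, "uper")  # unaligned Packed Encoding Rules
        msg = {"stationId": "PANC", "tempCelsius": -12, "windKnots": 35}
        encoded = db.encode("PilotReport", msg)
        print(len(encoded), "bytes")                 # a handful of bytes vs. verbose text
        assert db.decode("PilotReport", encoded) == msg

    Because the constraints bound each field, unaligned PER packs the integers into the minimum number of bits, which is the property that makes such encodings attractive on costly satellite links.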

  11. Photocatalytic degradation of textile dye using TiO2-activated carbon nanocomposite

    NASA Astrophysics Data System (ADS)

    Ghosh, Gourab; Basu, Sankhadeep; Saha, Sudeshna

    2018-05-01

    Rapid industrialisation has extended the use of dyes in various industrial applications in order to meet the escalating demand for consumer products. The toxicity level of a particular dye is very important due to its diverse effects on the environment and living organisms. Among the techniques for dye removal, adsorption and photocatalysis are two important processes that have gained much attention in recent years. In the present study, activated carbon (adsorbent), TiO2 nanoparticles (photocatalyst) and their composite were used for dye removal. The prepared samples were characterized using standard techniques such as XRD and SEM. The activated carbon was prepared from waste shells of Sterculia foetida. A mixture of activated carbon (activation temperature 600°C) and titania (calcined at 500°C) in the ratio 1:1 displayed greater dye-removal efficiency than its individual components. A reusability study indicated that the mixture could effectively be reused without further regeneration, as very little loss in efficiency was observed after a single cycle of use.

  12. Chapter 1: Reliably Measuring the Performance of Emerging Photovoltaic Solar Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rumbles, Garry; Reese, Matthew O; Marshall, Ashley

    Determining the power conversion efficiency of photovoltaic solar cells, especially those from new, emerging areas of technology, is important if advances in performance are to be made. However, although precise measurements are important, it is the accuracy of these types of measurements that can cause issues. Accurate measurements not only promote the development of new technology platforms, but they also enable comparisons with established technologies and allow assessments of advancements within the same field. This chapter provides insights into how measurements can be made with reasonable accuracy using both the components of the measuring system and a good protocol to acquire good data. The chapter discusses how to measure a calibrated lamp spectrum, determine a spectral mismatch factor, identify the correct reference cell and filter, define the illuminated active area, measure J-V curves to avoid any hysteresis effects, take note of sample degradation issues and avoid the temptation to artificially enhance efficiency data.

  13. Effects of Light Curing Method and Exposure Time on Mechanical Properties of Resin Based Dental Materials

    PubMed Central

    Alpöz, A. Riza; Ertuḡrul, Fahinur; Cogulu, Dilsah; Ak, Asli Topaloḡlu; Tanoḡlu, Metin; Kaya, Elçin

    2008-01-01

    Objectives The aim of this study was to investigate the microhardness and compressive strength of a composite resin (Tetric-Ceram, Ivoclar Vivadent), a compomer (Compoglass, Ivoclar Vivadent), and a resin-modified glass ionomer cement (Fuji II LC, GC Corp) polymerized using halogen light (Optilux 501, Demetron, Kerr) and LED (Bluephase C5, Ivoclar Vivadent) for different curing times. Methods Samples were placed in disc-shaped plastic molds of uniform size (5 mm diameter, 2 mm thickness) for the surface microhardness test, and in teflon cylinders 4 mm in diameter and 2 mm in length for the compressive strength test. For each subgroup, 20 samples for microhardness (n=180) and 5 samples for compressive strength were prepared (n=45). In group 1, samples were polymerized using the halogen light source for 40 seconds; in groups 2 and 3, samples were polymerized using the LED light source for 20 seconds and 40 seconds, respectively. All data were analyzed by two-way ANOVA and Tukey's post-hoc tests. Results The same exposure time of 40 seconds with a low-intensity LED was found to be similar to or more efficient than a high-intensity halogen light unit (P>.05); however, application of the LED for 20 seconds was less efficient than the 40-second curing time (P=.03). Conclusions It is important to increase the light curing time and use appropriate light curing devices to polymerize resin composite in deep cavities to maximize the hardness and compressive strength of restorative materials. PMID:19212507

  14. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    PubMed

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2018-01-01

    For surveys of sensitive issues in life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously compromise the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method derived from the item count technique that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, have not been studied so far. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. Theoretical considerations are integrated with a number of simulation studies based on data from two real surveys and conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
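    The IST point estimate itself reduces to a difference of sample means: one random subsample answers a long list containing the sensitive item, the other a short list without it, and respondents report only list totals. A minimal simulation sketch under simple random sampling follows; all data and names are invented.

        import numpy as np

        rng = np.random.default_rng(2)
        N = 2_000
        sensitive = rng.poisson(1.5, N)             # e.g., uses in the last month
        innocuous = rng.integers(0, 5, (N, 3)).sum(axis=1)

        idx = rng.permutation(N)
        long_list, short_list = idx[:1000], idx[1000:]
        # Only totals are reported, so individual answers stay anonymous.
        reported_long = sensitive[long_list] + innocuous[long_list]
        reported_short = innocuous[short_list]

        est = reported_long.mean() - reported_short.mean()
        se = np.sqrt(reported_long.var(ddof=1) / 1000
                     + reported_short.var(ddof=1) / 1000)
        print(f"IST estimate {est:.2f} ± {1.96*se:.2f} (true {sensitive.mean():.2f})")

    The optimal allocation question studied in the paper then amounts to choosing the split between the two subsamples (and across strata) to minimize the variance of this difference.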

  15. Evaluation of environmental filtration control of engineered nanoparticles using the Harvard Versatile Engineered Nanomaterial Generation System (VENGES)

    NASA Astrophysics Data System (ADS)

    Tsai, Candace S.-J.; Echevarría-Vega, Manuel E.; Sotiriou, Georgios A.; Santeufemio, Christopher; Schmidt, Daniel; Demokritou, Philip; Ellenbecker, Michael

    2012-05-01

    Applying engineering controls to airborne engineered nanoparticles (ENPs) is critical to prevent environmental releases and worker exposure. This study evaluated the effectiveness of two air sampling and six air cleaning fabric filters at collecting ENPs using industrially relevant flame-made engineered nanoparticles generated using a versatile engineered nanomaterial generation system (VENGES), recently designed and constructed at Harvard University. VENGES has the ability to generate metal and metal oxide exposure atmospheres while controlling important particle properties such as primary particle size, aerosol size distribution, and agglomeration state. For this study, amorphous SiO2 ENPs with a 15.4 nm primary particle size were generated and diluted with HEPA-filtered air. The aerosol was passed through the filter samples at two different filtration face velocities (2.3 and 3.5 m/min). Particle concentrations as a function of particle size were measured upstream and downstream of the filters using a specially designed filter test system to evaluate filtration efficiency. Real-time instruments (FMPS and APS) were used to measure particle concentration for diameters from 5 to 20,000 nm. Membrane-coated fabric filters were found to have nanoparticle collection efficiencies enhanced by 20-46 percentage points compared to non-coated fabric and could provide collection efficiencies above 95 %.

  16. The chemical and oxidation characteristics of semi-dry flue gas desulfurization ash from a steel factory.

    PubMed

    Liu, Ren-ping; Guo, Bin; Ren, Ailing; Bian, Jing-feng

    2010-10-01

    Samples of semi-dry flue gas desulfurization (FGD) ash were taken from the sinter gas of a steel factory. Scanning electron microscopy (SEM) and X-ray diffraction (XRD) analyses were employed to identify the samples in order to investigate their physical and chemical characteristics. The results show that semi-dry FGD ash from a steel factory is stable under atmospheric conditions. It has an irregular shape, a smooth surface and a loose structure. The size of the FGD ash particles is around 0.5-25 µm; the average size is about 5 µm and the median diameter is 4.18 µm. Semi-dry FGD ash from a steel factory consists of CaSO₃, CaSO₄, CaCO₃, some amorphous vitreous material and unburned carbon. An experimental method was developed to study the oxidation characteristics of the ash. A prediction model of the oxidation efficiency was obtained based on response surface methodology. The results show that not only the temperature but also the gas:solid ratio plays an important role in influencing the oxidation efficiency, and the interaction of the gas:solid ratio with temperature is essential. An improved response surface model was obtained which can help describe the degree of oxidation of semi-dry FGD ash.

  17. Evaluation of environmental filtration control of engineered nanoparticles using the Harvard Versatile Engineered Nanomaterial Generation System (VENGES)

    PubMed Central

    Echevarría-Vega, Manuel E.; Sotiriou, Georgios A.; Santeufemio, Christopher; Schmidt, Daniel; Demokritou, Philip; Ellenbecker, Michael

    2013-01-01

    Applying engineering controls to airborne engineered nanoparticles (ENPs) is critical to prevent environmental releases and worker exposure. This study evaluated the effectiveness of two air sampling and six air cleaning fabric filters at collecting ENPs using industrially relevant flame-made engineered nanoparticles generated using a versatile engineered nanomaterial generation system (VENGES), recently designed and constructed at Harvard University. VENGES has the ability to generate metal and metal oxide exposure atmospheres while controlling important particle properties such as primary particle size, aerosol size distribution, and agglomeration state. For this study, amorphous SiO2 ENPs with a 15.4 nm primary particle size were generated and diluted with HEPA-filtered air. The aerosol was passed through the filter samples at two different filtration face velocities (2.3 and 3.5 m/min). Particle concentrations as a function of particle size were measured upstream and downstream of the filters using a specially designed filter test system to evaluate filtration efficiency. Real-time instruments (FMPS and APS) were used to measure particle concentration for diameters from 5 to 20,000 nm. Membrane-coated fabric filters were found to have nanoparticle collection efficiencies enhanced by 20–46 percentage points compared to non-coated fabric and could provide collection efficiencies above 95 %. PMID:23412707

  18. Porosity, permeability and 3D fracture network characterisation of dolomite reservoir rock samples

    PubMed Central

    Voorn, Maarten; Exner, Ulrike; Barnhoorn, Auke; Baud, Patrick; Reuschlé, Thierry

    2015-01-01

    With fractured rocks making up an important part of hydrocarbon reservoirs worldwide, detailed analysis of fractures and fracture networks is essential. However, common analyses on drill core and plug samples taken from such reservoirs (including hand specimen analysis, thin section analysis and laboratory porosity and permeability determination) suffer from various problems, such as having a limited resolution, providing only 2D and no internal structure information, being destructive on the samples and/or not being representative for full fracture networks. In this paper, we therefore explore the use of an additional method – non-destructive 3D X-ray micro-Computed Tomography (μCT) – to obtain more information on such fractured samples. Seven plug-sized samples were selected from narrowly fractured rocks of the Hauptdolomit formation, taken from wellbores in the Vienna basin, Austria. These samples span a range of different fault rocks in a fault zone interpretation, from damage zone to fault core. We process the 3D μCT data in this study by a Hessian-based fracture filtering routine and can successfully extract porosity, fracture aperture, fracture density and fracture orientations – in bulk as well as locally. Additionally, thin sections made from selected plug samples provide 2D information with a much higher detail than the μCT data. Finally, gas- and water permeability measurements under confining pressure provide an important link (at least in order of magnitude) towards more realistic reservoir conditions. This study shows that 3D μCT can be applied efficiently on plug-sized samples of naturally fractured rocks, and that although there are limitations, several important parameters can be extracted. μCT can therefore be a useful addition to studies on such reservoir rocks, and provide valuable input for modelling and simulations. Permeability experiments under confining pressure also provide important additional insights. Combining these and other methods can therefore be a powerful approach in microstructural analysis of reservoir rocks, especially when applying the concepts that we present (on a small set of samples) in a larger study, in an automated and standardised manner. PMID:26549935

  19. Porosity, permeability and 3D fracture network characterisation of dolomite reservoir rock samples.

    PubMed

    Voorn, Maarten; Exner, Ulrike; Barnhoorn, Auke; Baud, Patrick; Reuschlé, Thierry

    2015-03-01

    With fractured rocks making up an important part of hydrocarbon reservoirs worldwide, detailed analysis of fractures and fracture networks is essential. However, common analyses on drill core and plug samples taken from such reservoirs (including hand specimen analysis, thin section analysis and laboratory porosity and permeability determination) suffer from various problems, such as having a limited resolution, providing only 2D and no internal structure information, being destructive on the samples and/or not being representative for full fracture networks. In this paper, we therefore explore the use of an additional method - non-destructive 3D X-ray micro-Computed Tomography (μCT) - to obtain more information on such fractured samples. Seven plug-sized samples were selected from narrowly fractured rocks of the Hauptdolomit formation, taken from wellbores in the Vienna basin, Austria. These samples span a range of different fault rocks in a fault zone interpretation, from damage zone to fault core. We process the 3D μCT data in this study by a Hessian-based fracture filtering routine and can successfully extract porosity, fracture aperture, fracture density and fracture orientations - in bulk as well as locally. Additionally, thin sections made from selected plug samples provide 2D information with a much higher detail than the μCT data. Finally, gas- and water permeability measurements under confining pressure provide an important link (at least in order of magnitude) towards more realistic reservoir conditions. This study shows that 3D μCT can be applied efficiently on plug-sized samples of naturally fractured rocks, and that although there are limitations, several important parameters can be extracted. μCT can therefore be a useful addition to studies on such reservoir rocks, and provide valuable input for modelling and simulations. Permeability experiments under confining pressure also provide important additional insights. Combining these and other methods can therefore be a powerful approach in microstructural analysis of reservoir rocks, especially when applying the concepts that we present (on a small set of samples) in a larger study, in an automated and standardised manner.

  20. Integrated approach for quantification of fractured tight reservoir rocks: Porosity, permeability analyses and 3D fracture network characterisation on fractured dolomite samples

    NASA Astrophysics Data System (ADS)

    Voorn, Maarten; Barnhoorn, Auke; Exner, Ulrike; Baud, Patrick; Reuschlé, Thierry

    2015-04-01

    Fractured reservoir rocks make up an important part of the hydrocarbon reservoirs worldwide. A detailed analysis of fractures and fracture networks in reservoir rock samples is thus essential to determine the potential of these fractured reservoirs. However, common analyses on drill core and plug samples taken from such reservoirs (including hand specimen analysis, thin section analysis and laboratory porosity and permeability determination) suffer from various problems, such as having a limited resolution, providing only 2D and no internal structure information, being destructive on the samples and/or not being representative for full fracture networks. In this study, we therefore explore the use of an additional method - non-destructive 3D X-ray micro-Computed Tomography (μCT) - to obtain more information on such fractured samples. Seven plug-sized samples were selected from narrowly fractured rocks of the Hauptdolomit formation, taken from wellbores in the Vienna Basin, Austria. These samples span a range of different fault rocks in a fault zone interpretation, from damage zone to fault core. 3D μCT data is used to extract porosity, fracture aperture, fracture density and fracture orientations - in bulk as well as locally. The 3D analyses are complemented with thin sections made to provide some 2D information with a much higher detail than the μCT data. Finally, gas- and water permeability measurements under confining pressure provide an important link (at least in order of magnitude) of the µCT results towards more realistic reservoir conditions. Our results show that 3D μCT can be applied efficiently on plug-sized samples of naturally fractured rocks, and that several important parameters can be extracted. μCT can therefore be a useful addition to studies on such reservoir rocks, and provide valuable input for modelling and simulations. Also permeability experiments under confining pressure provide important additional insights. Combining these and other methods can therefore be a powerful approach in microstructural analysis of reservoir rocks, especially when applying the concepts that we present (on a small set of samples) in a larger study, in an automated and standardised manner.

  1. Sampling for Patient Exit Interviews: Assessment of Methods Using Mathematical Derivation and Computer Simulations.

    PubMed

    Geldsetzer, Pascal; Fink, Günther; Vaikath, Maria; Bärnighausen, Till

    2018-02-01

    (1) To evaluate the operational efficiency of various sampling methods for patient exit interviews; (2) to discuss under what circumstances each method yields an unbiased sample; and (3) to propose a new, operationally efficient, and unbiased sampling method. Literature review, mathematical derivation, and Monte Carlo simulations. Our simulations show that in patient exit interviews it is most operationally efficient if the interviewer, after completing an interview, selects the next patient exiting the clinical consultation. We demonstrate mathematically that this method yields a biased sample: patients who spend a longer time with the clinician are overrepresented. This bias can be removed by selecting the next patient who enters, rather than exits, the consultation room. We show that this sampling method is operationally more efficient than alternative methods (systematic and simple random sampling) in most primary health care settings. Under the assumption that the order in which patients enter the consultation room is unrelated to the length of time spent with the clinician and the interviewer, selecting the next patient entering the consultation room tends to be the operationally most efficient unbiased sampling method for patient exit interviews. © 2016 The Authors. Health Services Research published by Wiley Periodicals, Inc. on behalf of Health Research and Educational Trust.
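    The bias the authors derive is easy to reproduce in a few lines of simulation. The sketch below assumes a single clinician seeing patients back-to-back with exponential consultation times; these modelling choices are ours, for illustration, and not the paper's exact setup.

        import numpy as np

        # Single clinician seeing patients back-to-back; one interviewer.
        # Strategy A: after each interview, take the next patient who EXITS.
        # Strategy B: take the next patient who ENTERS the consultation room.
        rng = np.random.default_rng(3)
        consult = rng.exponential(10.0, 100_000)   # minutes with the clinician
        starts = np.cumsum(consult) - consult      # consultation start times
        ends = starts + consult
        interview = 5.0                            # minutes per exit interview

        def mean_sampled_duration(pick_exiting: bool) -> float:
            t, sampled = 0.0, []
            for s, e in zip(starts, ends):
                ready = e >= t if pick_exiting else s >= t
                if ready:
                    sampled.append(e - s)          # sampled consultation length
                    t = e + interview              # interviewer busy until then
            return float(np.mean(sampled))

        print("next-exiting :", round(mean_sampled_duration(True), 1))   # above 10: length-biased
        print("next-entering:", round(mean_sampled_duration(False), 1))  # close to 10: unbiased

    Selecting the next exit oversamples long consultations because long intervals are more likely to cover the moment the interviewer becomes free, whereas the duration of the next patient to enter is independent of that moment.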

  2. Electroplating of nanostructured polyaniline-polypyrrole composite coating in a stainless-steel tube for on-line in-tube solid phase microextraction.

    PubMed

    Asiabi, Hamid; Yamini, Yadollah; Seidi, Shahram; Esrafili, Ali; Rezaei, Fatemeh

    2015-06-05

    In this work, a novel and efficient on-line in-tube solid phase microextraction method followed by high performance liquid chromatography was developed for preconcentration and determination of trace amounts of parabens. A nanostructured polyaniline-polypyrrole composite was electrochemically deposited on the inner surface of a stainless steel tube and used as the extraction phase. Several important factors that influence the extraction efficiency, including the type of solid-phase coating, extraction and desorption times, flow rates of the sample solution and eluent, pH, and ionic strength of the sample solution, were investigated and optimized. Under the optimal conditions, the limits of detection were in the range of 0.02-0.04 μg L⁻¹. This method showed good linearity for parabens in the range of 0.07-50 μg L⁻¹, with coefficients of determination better than 0.998. The intra- and inter-assay precisions (RSD%, n=3) were in the range of 5.9-7.0% and 4.4-5.7% at three concentration levels of 2, 10, and 20 μg L⁻¹, respectively. The extraction recovery values for the spiked samples were in the acceptable range of 80.3-90.2%. The validated method was successfully applied for the analysis of methyl-, ethyl-, and propyl parabens in some water, milk, and juice samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. A hybrid LIBS-Raman system combined with chemometrics: an efficient tool for plastic identification and sorting.

    PubMed

    Shameem, K M Muhammed; Choudhari, Khoobaram S; Bankapur, Aseefhali; Kulkarni, Suresh D; Unnikrishnan, V K; George, Sajan D; Kartha, V B; Santhosh, C

    2017-05-01

    Classification of plastics is of great importance in the recycling industry, as the volume of littered plastic waste grows steadily with their extensive use. In this paper, we demonstrate the efficacy of a combined laser-induced breakdown spectroscopy (LIBS)-Raman system for the rapid identification and classification of post-consumer plastics. The atomic and molecular information of polyethylene terephthalate, polyethylene, polypropylene, and polystyrene was studied using the plasma emission spectra and scattered signals obtained with the LIBS and Raman techniques, respectively. The collected spectral features of the samples were analyzed using statistical tools (principal component analysis, Mahalanobis distance) to categorize the plastics. The analyses of the data clearly show that the elemental and molecular information obtained from these techniques is effective for classification of plastics. In addition, the molecular information collected via Raman spectroscopy exhibits clearly distinct features for the transparent plastics (100% discrimination), whereas the LIBS technique shows better spectral feature differences for the colored samples. The study shows that the information obtained from these complementary techniques allows the complete classification of the plastic samples, irrespective of color or additives. This work further shows that the potential limitations of either technique for sample identification can be overcome by their complementarity.

  4. Efficacy of screens in removing long fibers from an aerosol stream – sample preparation technique for toxicology studies

    PubMed Central

    Ku, Bon Ki; Deye, Gregory J.; Turkevich, Leonid A.

    2015-01-01

    Fiber dimension (especially length) and biopersistence are thought to be important variables in determining the pathogenicity of asbestos and other elongate mineral particles. In order to prepare samples of fibers for toxicology studies, it is necessary to develop and evaluate methods for separating fibers by length in the micrometer size range. In this study, we have filtered an aerosol of fibers through nylon screens to investigate whether such screens can efficiently remove the long fibers (L >20 μm, a typical macrophage size) from the aerosol stream. Such a sample, deficient in long fibers, could then be used as the control in a toxicology study to investigate the role of length. A well-dispersed aerosol of glass fibers (a surrogate for asbestos) was generated by vortex shaking a Japan Fibrous Material Research Association (JFMRA) glass fiber powder. Fibers were collected on a mixed cellulose ester (MCE) filter, imaged with phase contrast microscopy (PCM) and lengths were measured. Length distributions of the fibers that penetrated through various screens (10, 20 and 60 μm mesh sizes) were analyzed; additional study was made of fibers that penetrated through double screen and centrally blocked screen configurations. Single screens were not particularly efficient in removing the long fibers; however, the alternative configurations, especially the centrally blocked screen configuration, yielded samples substantially free of the long fibers. PMID:24417374

  5. A modified indirect mathematical model for evaluation of ethanol production efficiency in industrial-scale continuous fermentation processes.

    PubMed

    Canseco Grellet, M A; Castagnaro, A; Dantur, K I; De Boeck, G; Ahmed, P M; Cárdenas, G J; Welin, B; Ruiz, R M

    2016-10-01

    To calculate fermentation efficiency in a continuous ethanol production process, we aimed to develop a robust mathematical method based on the analysis of metabolic by-product formation. This method is in contrast to the traditional way of calculating ethanol fermentation efficiency, where the ratio between the ethanol produced and the sugar consumed is expressed as a percentage of the theoretical conversion yield. Comparison between the two methods, at industrial scale and in sensitivity studies, showed that the indirect method was more robust and gave slightly higher fermentation efficiency values, although the fermentation efficiency of the industrial process was found to be low (~75%). The traditional calculation method is simpler than the indirect method as it only requires a few chemical determinations in the collected samples. However, a minor error in any measured parameter will have an important impact on the calculated efficiency. In contrast, the indirect method requires a greater number of determinations but is much more robust, since an error in any parameter will only have a minor effect on the fermentation efficiency value. We recommend applying the indirect calculation methodology to evaluate the real state of the process and to reach an optimum fermentation yield in industrial-scale ethanol production. Once a high fermentation yield has been reached, the traditional method should be used to maintain control of the process. Upon detection of lower yields in an optimized process, the indirect method should be employed, as it permits a more accurate diagnosis of the causes of yield losses so that the problem can be corrected rapidly. The low fermentation efficiency obtained in this study shows an urgent need for industrial process optimization, where the indirect calculation methodology will be an important tool to determine process losses. © 2016 The Society for Applied Microbiology.

  6. Population frequencies of the triallelic 5HTTLPR in six ethnically diverse samples from North America, Southeast Asia, and Africa.

    PubMed

    Haberstick, Brett C; Smolen, Andrew; Williams, Redford B; Bishop, George D; Foshee, Vangie A; Thornberry, Terence P; Conger, Rand; Siegler, Ilene C; Zhang, Xiaodong; Boardman, Jason D; Frajzyngier, Zygmunt; Stallings, Michael C; Brent Donnellan, M; Halpern, Carolyn T; Harris, Kathleen Mullan

    2015-03-01

    Genetic differences between populations are potentially an important contributor to health disparities around the globe. As differences in gene frequencies influence study design, it is important to have a thorough understanding of the natural variation of the genetic variant(s) of interest. Along these lines, we characterized the variation of the 5HTTLPR and rs25531 polymorphisms in six samples from North America, Southeast Asia, and Africa (Cameroon) that differ in their racial and ethnic composition. Allele and genotype frequencies were determined for 24,066 participants. Results indicated higher frequencies of the rs25531 G-allele among Black and African populations as compared with White, Hispanic and Asian populations. Further, we observed a greater number of 'extra-long' ('XL') 5HTTLPR alleles than previously reported. Extra-long alleles occurred almost entirely among Asian, Black and Non-White Hispanic populations, and were completely absent in the White and Native American populations. Lastly, when considered jointly, we observed between-sample differences in the genotype frequencies within racial and ethnic populations. Taken together, these data underscore the importance of characterizing the L-G allele to avoid misclassification of participants by genotype and for further studies of the impact XL alleles may have on the transcriptional efficiency of SLC6A4.

  7. An efficient sampling approach for variance-based sensitivity analysis based on the law of total variance in the successive intervals without overlapping

    NASA Astrophysics Data System (ADS)

    Yun, Wanying; Lu, Zhenzhou; Jiang, Xian

    2018-06-01

    To efficiently execute variance-based global sensitivity analysis, the law of total variance over successive non-overlapping intervals is first proved, and an efficient space-partition sampling-based approach is then built on it in this paper. By partitioning the sample points of the output into different subsets according to the different inputs, the proposed approach can evaluate all the main effects concurrently from a single group of sample points. In addition, there is no need to optimize the partition scheme in the proposed approach. The maximum length of the subintervals decreases as the number of sample points of the model input variables increases, which guarantees that the convergence condition of the space-partition approach is well satisfied. Furthermore, a new interpretation of the partition idea is given from the perspective of the variance ratio function. Finally, three test examples and one engineering application are employed to demonstrate the accuracy, efficiency and robustness of the proposed approach.
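
    The underlying idea is easy to demonstrate: by the law of total variance, Var(Y) = E[Var(Y|Xi)] + Var(E[Y|Xi]), and the first-order index is Si = Var(E[Y|Xi])/Var(Y). The sketch below is my own simplified binning estimator built on that identity, not the authors' algorithm: it sorts one set of outputs by each input in turn and averages within successive non-overlapping intervals. The Ishigami function is a standard test case with known analytic indices.

```python
import numpy as np

rng = np.random.default_rng(0)

def main_effects_by_partition(model, n=100_000, d=3, bins=50):
    """Estimate all first-order Sobol' indices from ONE sample set by
    partitioning the outputs according to successive intervals of each
    input (a simple stand-in for the space-partition idea)."""
    X = rng.random((n, d))
    Y = model(X)
    V = Y.var()
    S = []
    for i in range(d):
        order = np.argsort(X[:, i])
        groups = np.array_split(Y[order], bins)   # equal-count bins of X_i
        means = np.array([g.mean() for g in groups])
        weights = np.array([len(g) for g in groups]) / n
        # weighted variance of conditional means approximates Var(E[Y|X_i])
        S.append(float((weights * (means - Y.mean()) ** 2).sum() / V))
    return S

def ishigami(X, a=7.0, b=0.1):
    # standard test function on [-pi, pi]^3
    x1, x2, x3 = (2 * np.pi * X[:, k] - np.pi for k in range(3))
    return np.sin(x1) + a * np.sin(x2) ** 2 + b * x3 ** 4 * np.sin(x1)

print([round(s, 3) for s in main_effects_by_partition(ishigami)])
# analytic values are approximately [0.314, 0.442, 0.0]
```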

  8. Improving clinical laboratory efficiency: a time-motion evaluation of the Abbott m2000 RealTime and Roche COBAS AmpliPrep/COBAS TaqMan PCR systems for the simultaneous quantitation of HIV-1 RNA and HCV RNA.

    PubMed

    Amendola, Alessandra; Coen, Sabrina; Belladonna, Stefano; Pulvirenti, F Renato; Clemens, John M; Capobianchi, M Rosaria

    2011-08-01

    Diagnostic laboratories need automation that facilitates efficient processing and workflow management to meet today's challenges of expanding services and reducing cost while maintaining the highest levels of quality. The processing efficiency of two commercially available automated systems for quantifying HIV-1 and HCV RNA, the Abbott m2000 system and the Roche COBAS AmpliPrep/COBAS TaqMan 96 (docked) system (CAP/CTM), was evaluated in a mid/high-throughput workflow laboratory using a representative daily workload of 24 HCV and 72 HIV samples. Three test scenarios were evaluated: A) one run with four batches on the CAP/CTM system, B) two runs on the Abbott m2000 and C) one run using the Abbott m2000 maxCycle feature (maxCycle) for co-processing these assays. Cycle times for processing, throughput and hands-on time were evaluated. Overall processing cycle time was 10.3, 9.1 and 7.6 h for Scenarios A), B) and C), respectively. Total hands-on time for each scenario was, in order, 100.0 (A), 90.3 (B) and 61.4 min (C). The interface of an automated analyzer to the laboratory workflow, notably system setup for samples and reagents and clean-up functions, is as important as the automation capability of the analyzer for the overall impact on processing efficiency and operator hands-on time.

  9. How long is enough to detect terrestrial animals? Estimating the minimum trapping effort on camera traps

    PubMed Central

    Si, Xingfeng; Kays, Roland

    2014-01-01

    Camera traps are an important wildlife inventory tool for estimating species diversity at a site. Knowing what minimum trapping effort is needed to detect target species is also important for designing efficient studies, considering both the number of camera locations and survey length. Here, we take advantage of a two-year camera trapping dataset from a small (24-ha) study plot in Gutianshan National Nature Reserve, eastern China, to estimate the minimum trapping effort actually needed to sample the wildlife community. We also evaluated the relative value of adding new camera sites versus running cameras for a longer period at one site. The full dataset includes 1727 independent photographs captured during 13,824 camera days, documenting 10 resident terrestrial species of birds and mammals. Our rarefaction analysis shows that a minimum of 931 camera days would be needed to detect the resident species sufficiently in the plot, and c. 8700 camera days to detect all 10 resident species. In terms of detecting a diversity of species, the optimal sampling period for one camera site was c. 40 days, or long enough to record about 20 independent photographs. Our analysis of the effect of adding camera sites shows that rotating cameras to new sites would be more efficient for measuring species richness than leaving cameras at fewer sites for a longer period. PMID:24868493
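
    The rarefaction logic behind the "minimum effort" estimate can be sketched in a few lines: simulate how many species are detected at least once as camera-days accumulate, and look for where the curve flattens. The detection probabilities below are hypothetical placeholders, not the Gutianshan data.

```python
import random

random.seed(0)

# toy community: per-camera-day detection probability of each species
# (invented values for illustration only)
species_p = {"muntjac": 0.10, "silver_pheasant": 0.06, "hog_badger": 0.04,
             "ferret_badger": 0.02, "porcupine": 0.01}

def rarefy(camera_days, reps=200):
    """Mean number of species detected at least once in `camera_days`."""
    total = 0
    for _ in range(reps):
        seen = {sp for _ in range(camera_days)
                for sp, p in species_p.items() if random.random() < p}
        total += len(seen)
    return total / reps

for effort in (50, 100, 300, 600, 1000):
    print(effort, round(rarefy(effort), 2))
# the accumulation curve flattens once the rare species have been detected;
# the minimum adequate effort is where it approaches the asymptote (5 here)
```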

  10. Quantitation of next generation sequencing library preparation protocol efficiencies using droplet digital PCR assays - a systematic comparison of DNA library preparation kits for Illumina sequencing.

    PubMed

    Aigrain, Louise; Gu, Yong; Quail, Michael A

    2016-06-13

    The emergence of next-generation sequencing (NGS) technologies in the past decade has allowed the democratization of DNA sequencing, both in terms of price per sequenced base and the ease of producing DNA libraries. When it comes to preparing DNA sequencing libraries for Illumina, the current market leader, a plethora of kits are available, and it can be difficult for users to determine which kit is the most appropriate and efficient for their application; the main concerns are not only cost but also minimal bias, yield and time efficiency. We compared 9 commercially available library preparation kits in a systematic manner using the same DNA sample, by probing the amount of DNA remaining after each protocol step using a new droplet digital PCR (ddPCR) assay. This method allows the precise quantification of fragments bearing either adaptors or P5/P7 sequences on both ends just after ligation or PCR enrichment. We also investigated the potential influence of DNA input and DNA fragment size on the final library preparation efficiency. The overall library preparation efficiencies show important variations between the different kits, with the ones combining several steps into a single one exhibiting final yields 4 to 7 times higher than the other kits. Detailed ddPCR data also reveal that the adaptor ligation yield itself varies by more than a factor of 10 between kits, certain ligation efficiencies being so low that they could impair the original library complexity and impoverish the sequencing results. When a PCR enrichment step is necessary, lower adaptor-ligated DNA inputs lead to greater amplification yields, hiding the latent disparity between kits. We describe a ddPCR assay that allows us to probe the efficiency of the most critical step in library preparation, ligation, and to draw conclusions on which kits are more likely to preserve sample heterogeneity and reduce the need for amplification.

  11. Histological and Thermometric Examination of Soft Tissue De-Epithelialization Using Digitally Controlled Er:YAG Laser Handpiece: An Ex Vivo Study.

    PubMed

    Grzech-Leśniak, Kinga; Matys, Jacek; Jurczyszyn, Kamil; Ziółkowski, Piotr; Dominiak, Marzena; Brugnera Junior, Aldo; Romeo, Umberto

    2018-06-01

    The purpose of this study was the histological and thermometric examination of soft tissue de-epithelialization using a digitally controlled laser handpiece (DCLH) - X-Runner. Commonly used techniques for de-epithelialization include the scalpel, abrasion with a diamond bur, or a combination of the two. Despite being simple, inexpensive and effective, these techniques are invasive and may produce unwanted side effects. It is therefore important to look for alternative techniques using novel, minimally invasive and effective tools. 114 porcine samples sized 6 × 6 mm were collected from the attached gingiva (AG) of the alveolar process of the mandible using a 15C scalpel blade. The samples were irradiated by means of an Er:YAG laser (LightWalker, Fotona, Slovenia), using the X-Runner and H02 handpieces at different parameters: 80, 100, and 140 mJ/20 Hz for 6 or 16 s, respectively. The temperature was measured with a K-type thermocouple. For the histopathological analysis of the efficiency of epithelium removal and thermal injury, 3 random samples were de-epithelialized with the H02 handpiece, and 9 random samples with the X-Runner handpiece at different parameters. For the samples irradiated with the DCLH, we used three different settings, which resulted in removing 1 to 3 layers of the soft tissue. The efficiency of epithelium removal and the rise in temperature were analyzed. The DCLH induced a significantly lower temperature increase compared with the H02 handpiece at each energy-to-frequency ratio. The histological examination revealed total epithelium removal when the H02 handpiece was used at 100 and 140 mJ/20 Hz and when the DCLH was used for two- and threefold lasing at 80, 100, and 140 mJ/20 Hz. The Er:YAG laser with the DCLH handpiece may be an efficient tool for epithelium removal without excessive thermal damage.

  12. Fast, sensitive and reliable multi-residue method for routine determination of 34 pesticides from various chemical groups in water samples by using dispersive liquid-liquid microextraction coupled with gas chromatography-mass spectrometry.

    PubMed

    Tankiewicz, Maciej; Biziuk, Marek

    2018-02-01

    A simple and efficient dispersive liquid-liquid microextraction (DLLME) technique was developed using a mixture of two solvents: 40 μL of tetrachloroethylene (extraction solvent) and 1.0 mL of methanol (disperser solvent), which was rapidly injected with a syringe into 10 mL of water sample. Several important parameters affecting the extraction efficiency, such as the type and volume of solvents, water sample volume, extraction time, temperature, pH adjustment and salt addition, were investigated and optimized. Simultaneous determination of 34 commonly used pesticides was performed by gas chromatography coupled with mass spectrometry (GC-MS). The procedure was validated in order to obtain the highest efficiency at the lowest concentration levels of analytes and to fulfill the requirements of regulations on maximum residue limits. Under the optimum conditions, the linearity range was within 0.0096-100 μg L⁻¹. The limits of detection (LODs) of the developed DLLME-GC-MS methodology for all investigated pesticides were in the range of 0.0032 (endrin)-0.0174 (diazinon) μg L⁻¹, with limits of quantification (LOQs) from 0.0096 to 0.052 μg L⁻¹. At a concentration of 1 μg L⁻¹ for each pesticide, recoveries ranged between 84% (tebufenpyrad) and 108% (deltamethrin), with relative standard deviations (RSDs) (n = 7) from 1.1% (metconazole) to 11% (parathion-methyl). The methodology was successfully applied to check the contamination of environmental samples. The procedure proved to be selective, sensitive and precise for the simultaneous determination of various pesticides. The optimized analytical method is very simple and rapid (less than 5 min). Graphical abstract: The analytical procedure for testing water samples consists of dispersive liquid-liquid microextraction (DLLME) and gas chromatography coupled with mass spectrometry (GC-MS).

  13. Sampling enhancement for the quantum mechanical potential based molecular dynamics simulations: a general algorithm and its extension for free energy calculation on rugged energy surface.

    PubMed

    Li, Hongzhi; Yang, Wei

    2007-03-21

    An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic-structure self-consistent-field calculation is robustly guaranteed; this is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling developments specifically for quantum mechanical potentials, QM-based simulations treated with the present technique can possess sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.
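
    At the heart of any Hamiltonian replica exchange scheme is a Metropolis test on the energy change of swapping configurations between two Hamiltonians. The sketch below shows that generic criterion with toy one-dimensional potentials; it is a textbook HREM acceptance rule, not the authors' QM/MM implementation. In their design, exchanges would additionally be arranged so that QM-related replicas never swap structures directly with the strongly activated MM replicas.

```python
import math
import random

def accept_exchange(beta, U_m_xm, U_n_xn, U_m_xn, U_n_xm):
    """Metropolis criterion for swapping configurations x_m, x_n between
    replicas running on Hamiltonians U_m, U_n at inverse temperature beta:
    delta = beta * [U_m(x_n) + U_n(x_m) - U_m(x_m) - U_n(x_n)]."""
    delta = beta * (U_m_xn + U_n_xm - U_m_xm - U_n_xn)
    return delta <= 0 or random.random() < math.exp(-delta)

# toy example: replica n runs on a scaled ("activated") potential
def U(x):        return 0.5 * x * x      # base potential
def U_scaled(x): return 0.1 * U(x)       # low scaling parameter flattens it

x_m, x_n, beta = 0.3, 2.5, 1.0
if accept_exchange(beta, U(x_m), U_scaled(x_n), U(x_n), U_scaled(x_m)):
    x_m, x_n = x_n, x_m                  # swap configurations between replicas
print(x_m, x_n)
```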

  14. SU-E-I-79: Source Geometry Dependence of Gamma Well-Counter Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, M; Belanger, A; Kijewski, M

    Purpose: To determine the effect of liquid sample volume and geometry on counting efficiency in a gamma well-counter, and to assess the relative contributions of sample geometry and self-attenuation. Gamma well-counters are standard equipment in clinical and preclinical studies, for measuring patient blood radioactivity and quantifying animal tissue uptake for tracer development and other purposes. Accurate measurements are crucial. Methods: Count rates were measured for aqueous solutions of 99m-Tc at four liquid volume values in a 1-cm-diam tube and at six volume values in a 2.2-cm-diam vial. Total activity was constant for all volumes, and data were corrected for decay. Count rates from a point source in air, supported by a filter paper, were measured at seven heights between 1.3 and 5.7 cm from the bottom of a tube. Results: Sample volume effects were larger for the tube than for the vial. For the tube, count efficiency relative to a 1-cc volume ranged from 1.05 at 0.05 cc to 0.84 at 3 cc. For the vial, relative count efficiency ranged from 1.02 at 0.05 cc to 0.87 at 15 cc. For the point source, count efficiency relative to 1.3 cm from the tube bottom ranged from 0.98 at 1.8 cm to 0.34 at 5.7 cm. The relative efficiency of a 3-cc liquid sample in a tube compared to a 1-cc sample is 0.84; the average relative efficiency for the solid sample in air between the heights in the tube corresponding to the surfaces of those volumes (1.3 and 4.8 cm) is 0.81, implying that the major contribution to efficiency loss is geometry rather than attenuation. Conclusion: Volume-dependent correction factors should be used for accurate quantitation of radioactive liquid samples. Solid samples should be positioned at the bottom of the tube for maximum count efficiency.
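
    In practice, the recommended volume-dependent correction can be implemented as a small calibration lookup. The sketch below linearly interpolates the relative efficiencies quoted above for the 1-cm tube; interpolating between the three published points (and the example count number) is an assumption made purely for illustration.

```python
# relative counting efficiencies for the 1-cm-diameter tube, normalized to
# a 1-cc sample (the three values quoted in the abstract above)
volumes_cc = [0.05, 1.0, 3.0]
rel_eff    = [1.05, 1.00, 0.84]

def correction_factor(v):
    """Piecewise-linear interpolation of relative efficiency; intermediate
    volumes are interpolated, which goes beyond the published points."""
    for (v0, e0), (v1, e1) in zip(zip(volumes_cc, rel_eff),
                                  zip(volumes_cc[1:], rel_eff[1:])):
        if v0 <= v <= v1:
            return e0 + (e1 - e0) * (v - v0) / (v1 - v0)
    raise ValueError("volume outside calibrated range")

measured = 52_400                              # hypothetical counts, 2-cc sample
print(round(measured / correction_factor(2.0)))  # volume-corrected counts
```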

  15. Universal nucleic acids sample preparation method for cells, spores and their mixture

    DOEpatents

    Bavykin, Sergei [Darien, IL]

    2011-01-18

    The present invention relates to a method for extracting nucleic acids from biological samples. More specifically, the invention relates to a universal method for extracting nucleic acids from unidentified biological samples. An advantage of the presently invented method is its ability to effectively and efficiently extract nucleic acids from a variety of different cell types, including but not limited to prokaryotic or eukaryotic cells and/or recalcitrant organisms (i.e. spores). Unlike prior-art methods, which are focused on extracting nucleic acids from vegetative cells or spores, the present invention effectively extracts nucleic acids from spores, multiple cell types or mixtures thereof using a single method. Importantly, the invented method has demonstrated the ability to extract nucleic acids from spores and vegetative bacterial cells with similar levels of effectiveness. The invented method employs a multi-step protocol which erodes the cell structure of the biological sample; isolates, labels and fragments the nucleic acids; and purifies the labeled samples from the excess of dye.

  16. A new low-cost procedure for detecting nucleic acids in low-incidence samples: a case study of detecting spores of Paenibacillus larvae from bee debris.

    PubMed

    Ryba, Stepan; Kindlmann, Pavel; Titera, Dalibor; Haklova, Marcela; Stopka, Pavel

    2012-10-01

    American foulbrood, because of its virulence and worldwide spread, is currently one of the most dangerous diseases of honey bees. Quick diagnosis of this disease is therefore vitally important. For its successful eradication, however, all the hives in the region must be tested, which is time-consuming and costly. Therefore, a fast and sensitive method of detecting American foulbrood is needed. Here we present a method that significantly reduces the number of tests needed by combining batches of samples from different hives. The results of this method were verified by testing each sample individually. A simulation study was used to compare the efficiency of the new method with testing all the samples and to develop a decision tool for determining when best to use the new method. The method is suitable for testing large numbers of samples (over 100) when the incidence of the disease is low (10% or less).
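
    The efficiency gain from combining batches can be illustrated with the classic two-stage (Dorfman) pooled-testing calculation: test each pool once, then retest every member of a positive pool individually. This is a generic sketch of the idea, not the authors' exact protocol or decision tool, and it assumes a perfectly sensitive test on pooled material.

```python
def expected_tests_per_sample(prevalence, pool_size):
    """Expected number of tests per hive under two-stage pooling:
    one pooled test shared by `pool_size` hives, plus individual retests
    whenever the pool is positive."""
    p_pool_positive = 1 - (1 - prevalence) ** pool_size
    return 1 / pool_size + p_pool_positive

for k in (2, 5, 10, 20):
    print(k, round(expected_tests_per_sample(0.05, k), 3))
# at 5% incidence, pools of ~5 need about 0.43 tests per hive instead of 1;
# as incidence rises the advantage shrinks, matching the paper's condition
# that the method pays off when disease incidence is low
```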

  17. Dairy farm cost efficiency in leading milk-producing regions in Poland.

    PubMed

    Sobczyński, T; Klepacka, A M; Revoredo-Giha, C; Florkowski, W J

    2015-12-01

    This paper examines the cost efficiency of dairy farms in 2 important regions of commercial milk production in Poland (i.e., Wielkopolskie and Podlaskie). Both regions gained importance following the market-driven resource allocation mechanism adopted after Poland's transition to the market economy in 1989 and accession to the European Union (EU) in 2004. The elimination of the dairy quota system in the EU in 2015 offers new expansion opportunities. The analysis of trends in cow numbers, milk production, and yield per cow shows different patterns of expansion of the dairy sector in the 2 regions. We selected dairy farm data from the Farm Accountancy Data Network database for both regions and applied a cost frontier estimation model to calculate the relative cost-efficiency index for the period 2004 to 2009. The indexes compare each farm in the sample to the most efficient dairy farm in each region separately. Additionally, the top 5% of dairy farms with the highest relative cost-efficiency index from each region were compared in terms of production costs with published results from a study using the representative farm approach. The comparison of results from the 2 different studies permits the conclusion that Wielkopolskie and Podlaskie dairy farms are able to compete with farms from the 4 largest milk-producing countries in the EU. Although both regions can improve yields per cow, especially Podlaskie, both are likely to take advantage of the expansion opportunities offered by the 2015 termination of the milk quota system. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  18. On the enhanced sampling over energy barriers in molecular dynamics simulations.

    PubMed

    Gao, Yi Qin; Yang, Lijiang

    2006-09-21

    We present here calculations of free energies of multidimensional systems using an efficient sampling method. The method uses a transformed potential energy surface, which allows an efficient sampling of both low and high energy spaces and accelerates transitions over barriers. It allows efficient sampling of the configuration space over and only over the desired energy range(s). It does not require predetermined or selected reaction coordinate(s). We apply this method to study the dynamics of slow barrier crossing processes in a disaccharide and a dipeptide system.
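
    A minimal way to see how a transformed surface accelerates barrier crossing while preserving correct statistics is the sketch below: Metropolis sampling on a potential whose high-energy region is compressed, followed by reweighting back to the original ensemble. The specific transformation f(U) and the double-well potential are illustrative stand-ins, not the functions used in the paper.

```python
import math
import random

random.seed(2)
beta = 4.0

def U(x):
    return (x * x - 1.0) ** 2            # double well, barrier at x = 0

def U_eff(x, E0=0.4):
    """Transformed surface: energies above E0 are compressed, lowering the
    barrier while leaving low-energy regions intact (illustrative choice)."""
    u = U(x)
    return u if u <= E0 else E0 + 0.2 * (u - E0)

x, samples = -1.0, []
for _ in range(200_000):
    y = x + random.uniform(-0.3, 0.3)
    # Metropolis step on the TRANSFORMED surface: barrier crossings are easy
    if random.random() < math.exp(-beta * (U_eff(y) - U_eff(x))):
        x = y
    samples.append(x)

# reweight each sample back to the true ensemble: w = exp(-beta*(U - U_eff))
w = [math.exp(-beta * (U(s) - U_eff(s))) for s in samples]
mean_x2 = sum(wi * s * s for wi, s in zip(w, samples)) / sum(w)
print(round(mean_x2, 3))                 # <x^2> under the ORIGINAL potential
```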

  19. Random Numbers and Monte Carlo Methods

    NASA Astrophysics Data System (ADS)

    Scherer, Philipp O. J.

    Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by preferentially sampling the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
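
    Sampling preferentially where the integrand is large is the essence of importance sampling, and a one-dimensional quadrature makes it concrete. The sketch below is a standard textbook-style illustration (not code from this book): it estimates the integral of x·exp(-10x) over [0, 1] with uniform draws and with draws from a density q(x) proportional to exp(-10x); the estimator becomes mean(f(X)/q(X)) with X ~ q, and its variance drops sharply because q tracks the integrand's decay.

```python
import math
import random

random.seed(0)
N = 100_000
f = lambda x: x * math.exp(-10 * x)       # integrand on [0, 1]
exact = (1 - 11 * math.exp(-10)) / 100    # analytic value, ~0.009995

# plain Monte Carlo: x ~ Uniform(0, 1)
plain = sum(f(random.random()) for _ in range(N)) / N

# importance sampling from q(x) = exp(-10x) / Z on [0, 1]
Z = (1 - math.exp(-10)) / 10              # normalizer of exp(-10x) on [0, 1]
def draw_q():                             # inverse-CDF sampling from q
    u = random.random()
    return -math.log(1 - u * (1 - math.exp(-10))) / 10
imp = sum(f(x) * Z / math.exp(-10 * x)    # weight f(x)/q(x) = x * Z
          for x in (draw_q() for _ in range(N))) / N

print(f"exact {exact:.6f}  plain MC {plain:.6f}  importance {imp:.6f}")
```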

  20. Sampling error in timber surveys

    Treesearch

    Austin Hasel

    1938-01-01

    Various sampling strategies are evaluated for efficiency in an interior ponderosa pine forest. In a 5,760-acre tract, efficiency was gained by stratifying the tract into quarter-acre blocks and sampling randomly within strata. A systematic cruise was found to be superior for volume estimation.
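
    The gain from stratification is easy to reproduce: when volume trends across a tract, stratifying removes the between-strata component of variance from the sampling error. The sketch below uses invented block volumes with a spatial trend, not the 1938 survey data.

```python
import random
import statistics

random.seed(3)

# synthetic tract: 400 quarter-acre blocks whose volume trends across the
# tract plus noise (hypothetical numbers for illustration)
blocks = [50 + 0.2 * i + random.gauss(0, 5) for i in range(400)]

def srs(n=40):
    """Simple random sample mean of n blocks."""
    return statistics.mean(random.sample(blocks, n))

def stratified(n=40, strata=8):
    """Equal allocation: sample n/strata blocks inside each contiguous stratum."""
    size, per, picks = len(blocks) // strata, n // strata, []
    for s in range(strata):
        picks += random.sample(blocks[s * size:(s + 1) * size], per)
    return statistics.mean(picks)

def empirical_se(estimator, reps=2000):
    return statistics.pstdev([estimator() for _ in range(reps)])

print("SRS        SE:", round(empirical_se(srs), 2))
print("stratified SE:", round(empirical_se(stratified), 2))  # smaller:
# stratification removes the cross-tract trend from the sampling error
```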

  1. Optimal approaches for inline sampling of organisms in ballast water: L-shaped vs. Straight sample probes

    NASA Astrophysics Data System (ADS)

    Wier, Timothy P.; Moser, Cameron S.; Grant, Jonathan F.; Riley, Scott C.; Robbins-Wamsley, Stephanie H.; First, Matthew R.; Drake, Lisa A.

    2017-10-01

    Both L-shaped ("L") and straight ("Straight") sample probes have been used to collect water samples from a main ballast line in land-based or shipboard verification testing of ballast water management systems (BWMS). A series of experiments was conducted to quantify and compare the sampling efficiencies of L and Straight sample probes. The findings from this research-that both L and Straight probes sample organisms with similar efficiencies-permit increased flexibility for positioning sample probes aboard ships.

  2. Application of calculated NMR parameters, aromaticity indices and wavefunction properties for evaluation of corrosion inhibition efficiency of pyrazine inhibitors

    NASA Astrophysics Data System (ADS)

    Behzadi, Hadi; Manzetti, Sergio; Dargahi, Maryam; Roonasi, Payman; Khalilnia, Zahra

    2018-01-01

    In light of the importance of developing novel corrosion inhibitors, a series of quantum chemical calculations was carried out to evaluate the 15N chemical shielding (CS) tensors as well as aromaticity indices, including NICS, HOMA, FLU, and PDI, of three pyrazine derivatives: 2-methylpyrazine (MP), 2-aminopyrazine (AP) and 2-amino-5-bromopyrazine (ABP). The NICS parameters have been shown in previous studies to be paramount to the prediction of anti-corrosion properties, and are combined here with HOMA, FLU and PDI and detailed wavefunction analysis to determine the effects of bromination and methylation on pyrazine. The results show that the electron density around the nitrogens, represented by the CS tensors, can be a good indicator of anti-corrosion efficiency. Additionally, the NICS, FLU and PDI, as aromaticity indicators of the molecules, correlate well with the experimental corrosion inhibition efficiencies of the studied inhibitors. Bader sampling and detailed wavefunction analysis show that the major effect of bromination on the pyrazine derivatives is on the Laplacian of the electron density of the ring, delocalizing the aromatic electrons of the carbon atoms into lone pairs and increasing the polarization of the Laplacian values. This feature is in good agreement with empirical studies, which show that ABP is the most efficient anti-corrosion compound followed by AP and MP, a property which can be attributed to and predicted by derivation of the Laplacian of the electron density at the ring nuclei. This study shows the importance of devising DFT methods for the development of new corrosion inhibitors and the strength of electronic and nuclear analysis, and depicts most importantly how corrosion inhibitors composed of aromatic moieties may be modified to increase their anti-corrosive properties.

  3. Effects of smectite on the oil-expulsion efficiency of the Kreyenhagen Shale, San Joaquin Basin, California, based on hydrous-pyrolysis experiments

    USGS Publications Warehouse

    Lewan, Michael D.; Dolan, Michael P.; Curtis, John B.

    2014-01-01

    The amount of oil that maturing source rocks expel is expressed as their expulsion efficiency, which is usually stated in milligrams of expelled oil per gram of original total organic carbon (TOCO). Oil-expulsion efficiency can be determined by heating thermally immature source rocks in the presence of liquid water (i.e., hydrous pyrolysis) at temperatures between 350°C and 365°C for 72 hr. This pyrolysis method generates oil that is compositionally similar to natural crude oil and expels it by processes operative in the subsurface. Consequently, hydrous pyrolysis provides a means to determine oil-expulsion efficiencies and the rock properties that influence them. Smectite in source rocks has previously been considered to promote oil generation and expulsion and is the focus of this hydrous-pyrolysis study involving a representative sample of smectite-rich source rock from the Eocene Kreyenhagen Shale in the San Joaquin Basin of California. Smectite is the major clay mineral (31 wt. %) in this thermally immature sample, which contains 9.4 wt. % total organic carbon (TOC) comprised of type II kerogen. Compared to other immature source rocks that lack smectite as their major clay mineral, the expulsion efficiency of the Kreyenhagen Shale was significantly lower. The expulsion efficiency of the Kreyenhagen whole rock was reduced 88% compared to that of its isolated kerogen. This significant reduction is attributed to bitumen impregnating the smectite interlayers in addition to the rock matrix. Within the interlayers, much of the bitumen is converted to pyrobitumen through crosslinking instead of oil through thermal cracking. As a result, smectite does not promote oil generation but inhibits it. Bitumen impregnation of the rock matrix and smectite interlayers results in the rock pore system changing from water wet to bitumen wet. This change prevents potassium ion (K+) transfer and dissolution and precipitation reactions needed for the conversion of smectite to illite. As a result, illitization only reaches 35% to 40% at 310°C for 72 hr and remains unchanged to 365°C for 72 hr. Bitumen generation before or during early illitization in these experiments emphasizes the importance of knowing when and to what degree illitization occurs in natural maturation of a smectite-rich source rock to determine its expulsion efficiency. Complete illitization prior to bitumen generation is common for Paleozoic source rocks (e.g., Woodford Shale and Retort Phosphatic Shale Member of the Phosphoria Formation), and expulsion efficiencies can be determined on immature samples by hydrous pyrolysis. Conversely, smectite is more common in Cenozoic source rocks like the Kreyenhagen Shale, and expulsion efficiencies determined by hydrous pyrolysis need to be made on samples that reflect the level of illitization at or near bitumen generation in the subsurface.

  4. Bee species diversity enhances productivity and stability in a perennial crop.

    PubMed

    Rogers, Shelley R; Tarpy, David R; Burrack, Hannah J

    2014-01-01

    Wild bees provide important pollination services to agroecosystems, but the mechanisms which underlie their contribution to ecosystem functioning (and, therefore, their importance in maintaining and enhancing these services) remain unclear. We evaluated several mechanisms through which wild bees contribute to crop productivity, the stability of pollinator visitation, and the efficiency of individual pollinators in a highly bee-pollination-dependent plant, highbush blueberry. We surveyed the bee community (through transect sampling and pan trapping) and measured pollination of both open- and singly-visited flowers. We found that the abundance of managed honey bees, Apis mellifera, and wild-bee richness were equally important in describing the resulting open pollination. Wild-bee richness was a better predictor of pollination than wild-bee abundance. We also found evidence suggesting that pollinator visitation (and subsequent pollination) is stabilized through the differential response of bee taxa to weather (i.e., response diversity). Variation in the individual visit efficiency of A. mellifera and the southeastern blueberry bee, Habropoda laboriosa, a wild specialist, was not associated with changes in the pollinator community. Our findings add to a growing literature showing that diverse pollinator communities provide more stable and productive ecosystem services.

  6. Release modeling and comparison of nanoarchaeosomal, nanoliposomal and pegylated nanoliposomal carriers for paclitaxel.

    PubMed

    Movahedi, Fatemeh; Ebrahimi Shahmabadi, Hasan; Alavi, Seyed Ebrahim; Koohi Moftakhari Esfahani, Maedeh

    2014-09-01

    Breast cancer is the most prevalent cancer among women. Recently, delivery by nanocarriers has brought a remarkable evolution in the treatment of numerous cancers. Lipid nanocarriers are an important class among them, with liposomes and archaeosomes being common examples. In this work, paclitaxel was formulated and characterized in nanoliposomal and nanoarchaeosomal form to improve efficiency. To increase stability, efficiency and solubility, polyethylene glycol 2000 (PEG 2000) was added to some samples. An MTT assay confirmed the effectiveness of the nanocarriers against the MCF-7 cell line, and size measurements validated the nanoscale of the particles. The nanoarchaeosomal carrier demonstrated the highest encapsulation efficiency and lowest release rate. On the other hand, the pegylated nanoliposomal carrier showed higher loading efficiency and less release than the nanoliposomal carrier, which verifies the effect of PEG in improving stability and efficiency. Additionally, the release pattern was modeled using an artificial neural network (ANN) and a genetic algorithm (GA). ANN modeling of release prediction resulted in R values of 0.976, 0.989 and 0.999 for nanoliposomal, pegylated nanoliposomal and nanoarchaeosomal paclitaxel, while GA modeling led to values of 0.954, 0.951 and 0.976, respectively. ANN modeling was more successful in predicting release than the GA strategy.

  7. Are participants in markets for water rights more efficient in the use of water than non-participants? A case study for Limarí Valley (Chile).

    PubMed

    Molinos-Senante, María; Donoso, Guillermo; Sala-Garrido, Ramon

    2016-06-01

    The need to increase water productivity in agriculture has been stressed as one of the most important factors for achieving greater agricultural productivity and sustainability. The main aim of this paper is to investigate whether there are differences in water use efficiency (WUE) between farmers who participate in water markets and farmers who do not. Moreover, the use of a non-radial data envelopment analysis model allows the computation of global efficiency (GE) and WUE, as well as the efficiency in the use of other inputs such as fertilizers, pesticides, energy, and labor. In a second stage, external factors that may affect GE and WUE are explored. The empirical application focuses on a sample of farmers located in the Limarí Valley (Chile), where regulated permanent water rights (WR) markets for surface water have a long tradition. Results illustrate that WR sellers are the most efficient in the use of water, while non-traders are the farmers with the lowest WUE. From a policy perspective, significant conclusions are drawn from the assessment of agricultural water productivity in the framework of water markets.

  8. Evaluation of Surface Sampling for Bacillus Spores Using ...

    EPA Pesticide Factsheets

    In this study, commercially available domestic cleaning robots were evaluated for spore surface sampling efficiency on common indoor surfaces. The study determined the sampling efficiency of each robot without modifying the sensors, algorithms, or logic set by the manufacturers.

  9. Enhanced Third-Order Optical Nonlinearity Driven by Surface-Plasmon Field Gradients.

    PubMed

    Kravtsov, Vasily; AlMutairi, Sultan; Ulbricht, Ronald; Kutayiah, A Ryan; Belyanin, Alexey; Raschke, Markus B

    2018-05-18

    Efficient nonlinear optical frequency mixing in small volumes is key for future on-chip photonic devices. However, the generally low conversion efficiency severely limits miniaturization to nanoscale dimensions. Here we demonstrate that gradient-field effects can provide for an efficient, conventionally dipole-forbidden nonlinear response. We show that a longitudinal nonlinear source current can dominate the third-order optical nonlinearity of the free-electron response in gold in the technologically important near-IR frequency range, where the nonlinearities due to other mechanisms are particularly small. Using adiabatic nanofocusing to spatially confine the excitation fields, from measurements of the 2ω1-ω2 four-wave mixing response as a function of detuning ω1-ω2, we find up to 10⁻⁵ conversion efficiency, with a gradient-field contribution to the gold third-order susceptibility χ(3) of up to 10⁻¹⁹ m²/V². The results are in good agreement with theory based on plasma hydrodynamics and the underlying electron dynamics. The associated increase in nonlinear conversion efficiency with decreasing sample size, which can even overcompensate the volume decrease, offers a new approach for enhanced nonlinear nano-optics. This will enable more efficient nonlinear optical devices and the extension of coherent multidimensional spectroscopies to the nanoscale.

  10. A stochastic frontier analysis of technical efficiency of fish cage culture in Peninsular Malaysia.

    PubMed

    Islam, Gazi Md Nurul; Tai, Shzee Yew; Kusairi, Mohd Noh

    2016-01-01

    Cage culture plays an important role in achieving higher output and generating more export earnings in Malaysia. However, the costs of fingerlings, feed and labour have increased substantially for cage culture in the coastal areas of Peninsular Malaysia. This paper uses farm-level data gathered from Manjung, Perak and Kota Tinggi, Johor to investigate the technical efficiency of brackish-water fish cage culture using the stochastic frontier approach. Technical efficiency was estimated, and the factors affecting technical inefficiency of the fish cage culture system in Malaysia were specifically investigated. On average, 37 percent of the sampled fish cage farms are technically efficient. The results suggest that very high degrees of technical inefficiency exist among the cage culturists. This implies that great potential exists to increase fish production through improved efficiency in cage culture management in Peninsular Malaysia. The results indicate that farmers obtained grouper fingerlings from neighboring countries due to the scarcity of fingerlings from wild sources. Feeding grouper (Epinephelus fuscoguttatus) incurs relatively higher costs than seabass (Lates calcarifer) production in cage farms in the study areas. Initiatives to undertake extension programmes at the farm level are needed to help cage culturists utilize their resources more efficiently in order to substantially enhance their fish production.

  12. Estimating technical efficiency in the hospital sector with panel data: a comparison of parametric and non-parametric techniques.

    PubMed

    Siciliani, Luigi

    2006-01-01

    Policy makers are increasingly interested in developing performance indicators that measure hospital efficiency. These indicators may give the purchasers of health services an additional regulatory tool to contain health expenditure. Using panel data, this study compares different parametric (econometric) and non-parametric (linear programming) techniques for the measurement of a hospital's technical efficiency. This comparison was made using a sample of 17 Italian hospitals in the years 1996-9. The highest correlations in the efficiency scores are found between the non-parametric data envelopment analysis under the constant returns to scale assumption (DEA-CRS) and several parametric models. Correlation reduces markedly when using more flexible non-parametric specifications such as data envelopment analysis under the variable returns to scale assumption (DEA-VRS) and the free disposal hull (FDH) model. Correlation also generally reduces when moving from one-output to two-output specifications. This analysis suggests that there is scope for developing performance indicators at the hospital level using panel data, but it is important that extensive sensitivity analysis is carried out if purchasers wish to make use of these indicators in practice.
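
    For readers unfamiliar with the non-parametric side of this comparison, DEA-CRS amounts to one small linear program per hospital: minimize the input-contraction factor θ subject to a convex combination of peers producing at least as much output with at most θ times the unit's inputs. Below is a textbook input-oriented CCR sketch using scipy, with invented toy data rather than the study's 17-hospital panel.

```python
import numpy as np
from scipy.optimize import linprog

def dea_crs(X, Y):
    """Input-oriented CCR (constant returns to scale) efficiency scores.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). Efficient units score 1."""
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # decision variables: [theta, lambda_1 .. lambda_n]
        c = np.r_[1.0, np.zeros(n)]
        A_ub = np.zeros((m + s, n + 1))
        b_ub = np.zeros(m + s)
        A_ub[:m, 0] = -X[o]       # sum_j lambda_j x_j - theta * x_o <= 0
        A_ub[:m, 1:] = X.T
        A_ub[m:, 1:] = -Y.T       # -sum_j lambda_j y_j <= -y_o
        b_ub[m:] = -Y[o]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)
    return scores

# three toy hospitals: inputs = (beds, staff), output = (cases treated)
X = np.array([[100., 300.], [120., 280.], [150., 400.]])
Y = np.array([[2000.], [2100.], [2050.]])
print([round(t, 3) for t in dea_crs(X, Y)])
```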

  13. Shape Engineering Boosts Magnetic Mesoporous Silica Nanoparticle-Based Isolation and Detection of Circulating Tumor Cells.

    PubMed

    Chang, Zhi-Min; Wang, Zheng; Shao, Dan; Yue, Juan; Xing, Hao; Li, Li; Ge, Mingfeng; Li, Mingqiang; Yan, Huize; Hu, Hanze; Xu, Qiaobing; Dong, Wen-Fei

    2018-04-04

    Magnetic mesoporous silica nanoparticles (M-MSNs) are attractive candidates for the immunomagnetic isolation and detection of circulating tumor cells (CTCs). Understanding how the shape of M-MSNs affects their interaction with CTCs is crucial to maximizing binding capacity and capture efficiency, as well as to improving the sensitivity and efficiency of detection. In this work, fluorescent M-MSNs were rationally designed with sphere and rod morphologies while retaining robust fluorescence and uniform surface functionality. After conjugation with an antibody against the epithelial cell adhesion molecule (EpCAM), both of the differently shaped M-MSN-EpCAM conjugates achieved efficient enrichment of CTCs and fluorescence-based detection. Importantly, rod-like M-MSNs exhibited faster immunomagnetic isolation as well as better performance in the isolation and detection of CTCs in spiked cells and real clinical blood samples than their sphere-like counterparts. Our results show that shape engineering contributes positively to immunomagnetic isolation, which might open new avenues for the rational design of magnetic-fluorescent nanoprobes for the sensitive and efficient isolation and detection of CTCs.

  14. Parking Lot Runoff Quality and Treatment Efficiency of a Stormwater-Filtration Device, Madison, Wisconsin, 2005-07

    USGS Publications Warehouse

    Horwatich, Judy A.; Bannerman, Roger T.

    2010-01-01

    To evaluate the treatment efficiency of a stormwater-filtration device (SFD) for potential use at Wisconsin Department of Transportation (WisDOT) park-and-ride facilities, a SFD was installed at an employee parking lot in downtown Madison, Wisconsin. This type of parking lot was chosen for the test site because the constituent concentrations and particle-size distributions (PSDs) were expected to be similar to those of a typical park-and-ride lot operated by WisDOT. The objective of this particular installation was to reduce loads of total suspended solids (TSS) in stormwater runoff to Lake Monona. This study also was designed to provide a range of treatment efficiencies expected for a SFD. Samples from the inlet and outlet were analyzed for 33 organic and inorganic constituents, including 18 polycyclic aromatic hydrocarbons (PAHs). Samples were also analyzed for physical properties, including PSD. Water-quality samples were collected for 51 runoff events from November 2005 to August 2007. Samples from all runoff events were analyzed for concentrations of suspended sediment (SS). Samples from 31 runoff events were analyzed for 15 constituents, samples from 15 runoff events were analyzed for PAHs, and samples from 36 events were analyzed for PSD. The treatment efficiency of the SFD was calculated using the summation of loads (SOL) and the efficiency ratio methods. Constituents for which the concentrations and (or) loads were decreased by the SFD include TSS, SS, volatile suspended solids, total phosphorus (TP), total copper, total zinc, and PAHs. The efficiency ratios for these constituents are 45, 37, 38, 55, 22, 5, and 46 percent, respectively. The SOLs for these constituents are 32, 37, 28, 36, 23, 8, and 48 percent, respectively. For chloride, the SOL was -21 percent and the efficiency ratio was -18 percent. Six chemical constituents or properties (dissolved phosphorus, chemical oxygen demand, dissolved zinc, total dissolved solids, dissolved chemical oxygen demand, and dissolved copper) were not included in the efficiency-ratio or SOL calculations, because the differences between concentrations in samples from the inlet and outlet were not significant. Concentrations of TP and TSS were inexplicably high in samples at the inlet for one event.
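
    The two treatment-efficiency measures used here differ in what they aggregate: the efficiency ratio compares average event-mean concentrations, while the summation of loads compares total constituent mass over all events. The sketch below shows one common convention for each, with hypothetical numbers; the report's exact computational details may differ.

```python
def efficiency_ratio(inlet_conc, outlet_conc):
    """Efficiency ratio (%): reduction of the average event-mean
    concentration from inlet to outlet."""
    mean_in = sum(inlet_conc) / len(inlet_conc)
    mean_out = sum(outlet_conc) / len(outlet_conc)
    return 100 * (mean_in - mean_out) / mean_in

def summation_of_loads(inlet_loads, outlet_loads):
    """SOL (%): total mass removed relative to total mass entering."""
    return 100 * (sum(inlet_loads) - sum(outlet_loads)) / sum(inlet_loads)

# hypothetical TSS data for three runoff events (mg/L and kg, invented)
conc_in, conc_out = [120, 80, 200], [60, 50, 120]
load_in, load_out = [12.0, 6.0, 30.0], [6.0, 4.0, 19.0]
print(round(efficiency_ratio(conc_in, conc_out), 1), "% (efficiency ratio)")
print(round(summation_of_loads(load_in, load_out), 1), "% (SOL)")
```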

  15. In vivo biotinylation and incorporation of a photo-inducible unnatural amino acid to an antibody-binding domain improve site-specific labeling of antibodies.

    PubMed

    Kanje, Sara; Hober, Sophia

    2015-04-01

    Antibodies are important molecules in many research fields, where they play a key role in various assays. Antibody labeling is therefore of great importance. Currently, most labeling techniques take advantage of certain amino acid side chains that commonly appear throughout proteins. This makes it hard to control the position and exact degree of labeling of each antibody; hence, labeling may affect the antibody-binding site. This paper presents a novel protein domain based on the IgG-binding domain C2 of streptococcal protein G, containing the unnatural amino acid BPA, which can cross-link to other molecules. This novel domain can, with improved efficiency compared to previously reported similar domains, site-specifically cross-link to IgG at the Fc region. An efficient method for the simultaneous in vivo incorporation of BPA and specific biotinylation in a flask cultivation of Escherichia coli is described. In comparison to a traditionally labeled antibody sample, the C2-labeled counterpart proved to have a higher proportion of functional antibodies when immobilized on a solid surface and the same limit of detection in an ELISA. This method of labeling is, due to its efficiency and simplicity, of high interest for all antibody-based assays where it is important that labeling does not interfere with the antibody-binding site. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Efficiency, ownership, and financing of hospitals: the case of Austria.

    PubMed

    Czypionka, Thomas; Kraus, Markus; Mayer, Susanne; Röhrling, Gerald

    2014-12-01

    While standard economic theory posits that privately owned hospitals are more efficient than their public counterparts, no clear conclusion can yet be drawn for Austria in this regard. As previous Austrian efficiency studies rely on data from the 1990s only and are based on small hospital samples, the generalizability of these results is questionable. To examine the impact of ownership type on efficiency, we apply a Data Envelopment Analysis which extends the existing literature in two respects: first, it evaluates the efficiency of the Austrian acute care sector, using data on 128 public and private non-profit hospitals from the year 2010; second, it additionally focusses on the inpatient sector alone, thus increasing the comparability between hospitals. Overall, the results show that in Austria, private non-profit hospitals outperform public hospitals in terms of technical efficiency. A multiple regression analysis confirms the significant association between efficiency and ownership type. This conclusive result contrasts some international evidence and can most likely be attributed to differences in financial incentives for public and private non-profit hospitals in Austria. Therefore, by drawing on the example of the Austrian acute care hospital sector and existing literature on the German acute care hospital sector, we also discuss the impact of hospital financing systems and their incentives on efficiency. This paper thus also aims at providing a proof of principle, pointing out the importance of the respective market conditions when internationally comparing hospital efficiency by ownership type.

  17. An efficient method for simultaneous extraction of high-quality RNA and DNA from various plant tissues.

    PubMed

    Oliveira, R R; Viana, A J C; Reátegui, A C E; Vincentz, M G A

    2015-12-29

    Determination of gene expression is an important tool to study biological processes and relies on the quality of the extracted RNA. Changes in gene expression profiles may be directly related to mutations in regulatory DNA sequences or alterations in DNA cytosine methylation, which is an epigenetic mark. Correlation of gene expression with DNA sequence or epigenetic mark polymorphism is often desirable; for this, a robust protocol to isolate high-quality RNA and DNA simultaneously from the same sample is required. Although commercial kits and protocols are available, they are mainly optimized for animal tissues and, in general, restricted to RNA or DNA extraction, not both. In the present study, we describe an efficient and accessible method to extract both RNA and DNA simultaneously from the same sample of various plant tissues, using small amounts of starting material. The protocol was efficient in the extraction of high-quality nucleic acids from several Arabidopsis thaliana tissues (e.g., leaf, inflorescence stem, flower, fruit, cotyledon, seedlings, root, and embryo) and from tissues of other, non-model plants, such as Avicennia schaueriana (Acanthaceae), Theobroma cacao (Malvaceae), Paspalum notatum (Poaceae), and Sorghum bicolor (Poaceae). The obtained nucleic acids were used as templates for downstream analyses, such as mRNA sequencing, quantitative real-time polymerase chain reaction, bisulfite treatment, and others; the results were comparable to those obtained with commercial kits. We believe that this protocol could be applied to a broad range of plant species, help avoid technical and sampling biases, and facilitate a variety of RNA- and DNA-dependent analyses.

  18. Relative efficiency of unequal versus equal cluster sizes in cluster randomized trials using generalized estimating equation models.

    PubMed

    Liu, Jingxia; Colditz, Graham A

    2018-05-01

    There is growing interest in conducting cluster randomized trials (CRTs). For simplicity in sample size calculation, the cluster sizes are assumed to be identical across all clusters. However, equal cluster sizes are not guaranteed in practice. Therefore, the relative efficiency (RE) of unequal versus equal cluster sizes has been investigated when testing the treatment effect. One of the most important approaches to analyze a set of correlated data is the generalized estimating equation (GEE) proposed by Liang and Zeger, in which the "working correlation structure" is introduced and the association pattern depends on a vector of association parameters denoted by ρ. In this paper, we utilize GEE models to test the treatment effect in a two-group comparison for continuous, binary, or count data in CRTs. The variances of the estimator of the treatment effect are derived for the different types of outcome. RE is defined as the ratio of the variance of the estimator of the treatment effect for equal to unequal cluster sizes. We discuss the exchangeable correlation structure, which is commonly used in CRTs, and derive simplified formulas of the RE for continuous, binary, and count outcomes. Finally, REs are investigated for several scenarios of cluster size distributions through simulation studies. We propose an adjusted sample size to compensate for the efficiency loss. Additionally, we also propose an optimal sample size estimation based on the GEE models under a fixed budget for known and unknown association parameter (ρ) in the working correlation structure within the cluster. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
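
    A minimal sketch of how such an RE can be computed for a continuous outcome, assuming the standard approximation that a cluster of size m_i contributes information proportional to m_i / (1 + (m_i - 1)ρ) under an exchangeable working correlation; this illustrates the idea, not the paper's exact formulas.

```python
# RE of unequal vs equal cluster sizes under an exchangeable working correlation.
# Assumes cluster i contributes information ~ m_i / (1 + (m_i - 1) * rho).
import numpy as np

def info(sizes, rho):
    sizes = np.asarray(sizes, dtype=float)
    return np.sum(sizes / (1.0 + (sizes - 1.0) * rho))

def relative_efficiency(sizes, rho):
    """Var(equal sizes) / Var(unequal sizes), same total n and cluster count."""
    m_bar = np.mean(sizes)
    return info(sizes, rho) / info([m_bar] * len(sizes), rho)  # <= 1

sizes = [10, 10, 30, 50, 100]          # hypothetical cluster sizes
for rho in (0.01, 0.05, 0.20):
    print(f"rho = {rho:.2f}: RE = {relative_efficiency(sizes, rho):.3f}")
```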

  19. The hydroxyl-functionalized magnetic particles for purification of glycan-binding proteins.

    PubMed

    Sun, Xiuxuan; Yang, Ganglong; Sun, Shisheng; Quan, Rui; Dai, Weiwei; Li, Bin; Chen, Chao; Li, Zheng

    2009-12-01

    Glycan-protein interactions play important roles in many biological processes. Although methods such as glycan arrays can elucidate recognition events between carbohydrates and proteins and screen for important glycan-binding proteins, a simple and effective separation method for purifying these proteins from complex samples has been lacking. In proteomics studies, fractionation of samples can help to reduce their complexity and to enrich specific classes of proteins for subsequent downstream analyses. Herein, a rapid, simple method for purification of glycan-binding proteins from proteomic samples was developed using hydroxyl-coated magnetic particles coupled with underivatized carbohydrates. First, the epoxy-coated magnetic particles were hydroxyl-functionalized with 4-hydroxybenzhydrazide; the carbohydrates were then efficiently immobilized on the hydroxyl-functionalized surface of the magnetic particles through glycosidic bond formation with the hemiacetal group at the reducing end of the carbohydrate via condensation. All conditions of this method were optimized. The magnetic particle-carbohydrate conjugates were used to purify glycan-binding proteins from human serum, and the fractionated glycan-binding protein population was displayed by SDS-PAGE. The results showed that 1 mg of magnetic particles coupled 10 micromol of mannose in acetate buffer (pH 5.4). The fractionated glycan-binding protein population in human serum could be eluted from the magnetic particle-mannose conjugates by 0.1% SDS. The methodology could work together with glycan microarrays for screening and purification of important GBPs from complex protein samples.

  20. Technique for fast and efficient hierarchical clustering

    DOEpatents

    Stork, Christopher

    2013-10-08

    A fast and efficient technique for hierarchical clustering of samples in a dataset includes compressing the dataset to reduce the number of variables within each of the samples of the dataset. A nearest neighbor matrix is generated to identify nearest neighbor pairs between the samples based on differences between the variables of the samples. The samples are arranged into a hierarchy that groups the samples based on the nearest neighbor matrix. The hierarchy is rendered to a display to graphically illustrate similarities or differences between the samples.
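
    A rough sketch of the pipeline the patent describes, using off-the-shelf stand-ins for each step: PCA for the compression, pairwise distances for the nearest neighbor matrix, single-linkage (nearest neighbor) agglomeration for the hierarchy, and a dendrogram for the rendering. The data are synthetic.

```python
# Compress -> nearest-neighbour distances -> hierarchy -> display (synthetic data).
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(0)
data = rng.normal(size=(30, 200))                     # 30 samples, 200 variables

compressed = PCA(n_components=5).fit_transform(data)  # step 1: compression
dists = pdist(compressed)                             # step 2: pairwise distances
tree = linkage(dists, method="single")                # step 3: NN-based hierarchy
dendrogram(tree)                                      # step 4: render hierarchy
plt.show()
```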

  1. Environmental Variations in the Atomic and Molecular Gas Radial Profiles of Nearby Spiral Galaxies

    NASA Astrophysics Data System (ADS)

    Mok, Angus; Wilson, Christine; JCMT Nearby Galaxies Legacy Survey

    2017-01-01

    We present an analysis of the radial profiles of a sample of 43 HI-flux selected spiral galaxies from the Nearby Galaxies Legacy Survey (NGLS) with resolved James Clerk Maxwell Telescope (JCMT) CO J = 3-2 and/or Very Large Array (VLA) HI maps. Comparing the Virgo and non-Virgo populations, we confirm that the HI disks are truncated in the Virgo sample, even for these relatively HI-rich galaxies. On the other hand, the H2 distribution is enhanced for Virgo galaxies near their centres, resulting in higher H2 to HI ratios and steeper H2 and total gas radial profiles. This is likely due to the effects of moderate ram pressure stripping in the cluster environment, which would preferentially remove low density gas in the outskirts while enhancing higher density gas near the centre. Combined with Hα star formation rate data, we find that the star formation efficiency (SFR/H2) is relatively constant with radius for both samples, but Virgo galaxies have a ~40% lower star formation efficiency than non-Virgo galaxies. These results suggest that the environment of spiral galaxies can play an important role in the formation of molecular gas and the star formation process.

  2. Advances in understanding the surface chemistry of lignocellulosic biomass via time-of-flight secondary ion mass spectrometry

    DOE PAGES

    Tolbert, Allison K.; Ragauskas, Arthur J.

    2016-12-12

    Overcoming the natural recalcitrance of lignocellulosic biomass is necessary in order to efficiently convert biomass into biofuels or biomaterials, and many times this requires some type of chemical pretreatment and/or biological treatment. While bulk chemical analysis is the traditional method of determining the impact a treatment has on biomass, the chemistry on the surface of the sample can differ from the bulk chemistry. Specifically, enzymes and microorganisms bind to the surface of the biomass and their efficiency could be greatly impacted by the chemistry of the surface. Therefore, it is important to study and understand the chemistry of the biomass at the surface. Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is a powerful tool that can spectrally and spatially analyze the surface chemistry of a sample. This review discusses the advances in understanding lignocellulosic biomass surface chemistry using the ToF-SIMS by addressing the instrument parameters, biomass sample preparation, and characteristic lignocellulosic ion fragmentation peaks along with their typical location in the plant cell wall. Furthermore, the use of the ToF-SIMS in detecting chemical changes due to chemical pretreatments, microbial treatments, and physical or genetic modifications is discussed along with possible future applications of the instrument in lignocellulosic biomass studies.

  3. Entropy-Based Search Algorithm for Experimental Design

    NASA Astrophysics Data System (ADS)

    Malakar, N. K.; Knuth, K. H.

    2011-03-01

    The scientific method relies on the iterated processes of inference and inquiry. The inference phase consists of selecting the most probable models based on the available data; whereas the inquiry phase consists of using what is known about the models to select the most relevant experiment. Optimizing inquiry involves searching the parameterized space of experiments to select the experiment that promises, on average, to be maximally informative. In the case where it is important to learn about each of the model parameters, the relevance of an experiment is quantified by the Shannon entropy of the distribution of experimental outcomes predicted by a probable set of models. If the set of potential experiments is described by many parameters, we must search this high-dimensional entropy space. Brute force search methods will be slow and computationally expensive. We present an entropy-based search algorithm, called nested entropy sampling, to select the most informative experiment for efficient experimental design. This algorithm is inspired by Skilling's nested sampling algorithm used in inference and borrows the concept of a rising threshold while a set of experiment samples is maintained. We demonstrate that this algorithm not only selects highly relevant experiments, but also is more efficient than brute force search. Such entropic search techniques promise to greatly benefit autonomous experimental design.
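
    A toy sketch of the rising-threshold mechanic, assuming a hypothetical one-dimensional experiment space and a stand-in predictive entropy function; it illustrates the population-plus-threshold idea only, not the authors' algorithm in full.

```python
# Toy nested-entropy-style search: keep a population of experiment samples,
# repeatedly replace the least informative one with a candidate that beats
# the rising entropy threshold. The entropy model below is a made-up stand-in.
import numpy as np

rng = np.random.default_rng(1)

def predictive_entropy(x):
    """Toy Shannon entropy of the predicted binary outcome at setting x."""
    p = np.clip(0.5 + 0.4 * np.sin(3 * x), 1e-9, 1 - 1e-9)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

pop = rng.uniform(0, 2 * np.pi, size=20)      # initial experiment samples
for _ in range(200):
    ent = predictive_entropy(pop)
    worst = np.argmin(ent)                    # threshold rises to the worst entropy
    for _ in range(1000):                     # guarded rejection sampling
        cand = rng.uniform(0, 2 * np.pi)
        if predictive_entropy(cand) > ent[worst]:
            pop[worst] = cand
            break

best = pop[np.argmax(predictive_entropy(pop))]
print(f"most informative experiment setting ~ {best:.3f}")
```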

  4. Advances in understanding the surface chemistry of lignocellulosic biomass via time-of-flight secondary ion mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tolbert, Allison K.; Ragauskas, Arthur J.

    Overcoming the natural recalcitrance of lignocellulosic biomass is necessary in order to efficiently convert biomass into biofuels or biomaterials, and many times this requires some type of chemical pretreatment and/or biological treatment. While bulk chemical analysis is the traditional method of determining the impact a treatment has on biomass, the chemistry on the surface of the sample can differ from the bulk chemistry. Specifically, enzymes and microorganisms bind to the surface of the biomass and their efficiency could be greatly impacted by the chemistry of the surface. Therefore, it is important to study and understand the chemistry of the biomass at the surface. Time-of-flight secondary ion mass spectrometry (ToF-SIMS) is a powerful tool that can spectrally and spatially analyze the surface chemistry of a sample. This review discusses the advances in understanding lignocellulosic biomass surface chemistry using the ToF-SIMS by addressing the instrument parameters, biomass sample preparation, and characteristic lignocellulosic ion fragmentation peaks along with their typical location in the plant cell wall. Furthermore, the use of the ToF-SIMS in detecting chemical changes due to chemical pretreatments, microbial treatments, and physical or genetic modifications is discussed along with possible future applications of the instrument in lignocellulosic biomass studies.

  5. Efficient Monte Carlo Estimation of the Expected Value of Sample Information Using Moment Matching.

    PubMed

    Heath, Anna; Manolopoulou, Ioanna; Baio, Gianluca

    2018-02-01

    The Expected Value of Sample Information (EVSI) is used to calculate the economic value of a new research strategy. Although this value would be important to both researchers and funders, there are very few practical applications of the EVSI. This is due to computational difficulties associated with calculating the EVSI in practical health economic models using nested simulations. We present an approximation method for the EVSI that is framed in a Bayesian setting and is based on estimating the distribution of the posterior mean of the incremental net benefit across all possible future samples, known as the distribution of the preposterior mean. Specifically, this distribution is estimated using moment matching coupled with simulations that are available for probabilistic sensitivity analysis, which is typically mandatory in health economic evaluations. This novel approximation method is applied to a health economic model that has previously been used to assess the performance of other EVSI estimators and accurately estimates the EVSI. The computational time for this method is competitive with other methods. We have developed a new calculation method for the EVSI which is computationally efficient and accurate. This novel method relies on some additional simulation and so can be expensive in models with a large computational cost.
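
    For context, the quantity being approximated can be written down directly in a conjugate toy model: the EVSI is the expected gain from deciding on the preposterior mean rather than the prior mean. The sketch below estimates it by plain Monte Carlo for a single normal incremental-net-benefit parameter; the priors and study size are hypothetical, and this is not the authors' moment-matching estimator.

```python
# Monte Carlo over the distribution of the preposterior mean in a toy
# normal-normal model (hypothetical numbers; not the moment-matching method).
import numpy as np

rng = np.random.default_rng(2)
mu0, sd0 = 500.0, 2000.0        # prior on incremental net benefit
sigma, n = 4000.0, 50           # sd and size of the proposed future study

def posterior_mean(xbar):
    """Conjugate posterior mean of theta given the future sample mean."""
    w = (n / sigma**2) / (n / sigma**2 + 1 / sd0**2)
    return w * xbar + (1 - w) * mu0

theta = rng.normal(mu0, sd0, size=200_000)            # parameter draws
xbar = rng.normal(theta, sigma / np.sqrt(n))          # simulated future data
pre_post = posterior_mean(xbar)                       # preposterior means
evsi = np.mean(np.maximum(pre_post, 0.0)) - max(mu0, 0.0)
print(f"EVSI estimate: {evsi:.1f}")
```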

  6. A probabilistic bridge safety evaluation against floods.

    PubMed

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
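
    A minimal sketch of the two-stage idea: fit a cheap response surface on a small budget of expensive limit-state evaluations, then run a large Monte Carlo simulation on the surrogate. Here scikit-learn's SVR stands in for the paper's Bayesian least squares support vector machine, and the limit-state function is a toy.

```python
# Surrogate-assisted Monte Carlo reliability sketch (toy limit state; SVR
# stands in for the Bayesian least-squares SVM used in the paper).
import numpy as np
from scipy.stats import norm
from sklearn.svm import SVR

rng = np.random.default_rng(3)

def g(x):                                    # failure when g(x) < 0
    return 4.0 - x[:, 0] - 0.5 * x[:, 1] ** 2

X_train = rng.normal(size=(200, 2))          # small budget of "expensive" runs
surrogate = SVR(kernel="rbf", C=10.0).fit(X_train, g(X_train))

X_mc = rng.normal(size=(200_000, 2))         # large, cheap MCS on the surrogate
pf = float(np.mean(surrogate.predict(X_mc) < 0.0))
beta = -norm.ppf(pf) if pf > 0 else float("inf")
print(f"failure probability ~ {pf:.4f}, safety index ~ {beta:.2f}")
```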

  7. A Novel Energy-Efficient Approach for Human Activity Recognition.

    PubMed

    Zheng, Lingxiang; Wu, Dihong; Ruan, Xiaoyang; Weng, Shaolin; Peng, Ao; Tang, Biyu; Lu, Hai; Shi, Haibin; Zheng, Huiru

    2017-09-08

    In this paper, we propose a novel energy-efficient approach for a mobile activity recognition system (ARS) to detect human activities. The proposed energy-efficient ARS, using low sampling rates, can achieve high recognition accuracy and low energy consumption. A novel classifier that integrates hierarchical support vector machine and context-based classification (HSVMCC) is presented to achieve a high accuracy of activity recognition when the sampling rate is less than the activity frequency, i.e., the Nyquist sampling theorem is not satisfied. We tested the proposed energy-efficient approach with the data collected from 20 volunteers (14 males and 6 females) and an average recognition accuracy of around 96.0% was achieved. Results show that using a low sampling rate of 1 Hz can save 17.3% and 59.6% of energy compared with the sampling rates of 5 Hz and 50 Hz. The proposed low sampling rate approach can greatly reduce the power consumption while maintaining high activity recognition accuracy. The composition of power consumption in online ARS is also investigated in this paper.

  8. Fog collecting biomimetic surfaces: Influence of microstructure and wettability.

    PubMed

    Azad, M A K; Ellerbrok, D; Barthlott, W; Koch, K

    2015-01-19

    We analyzed the fog collection efficiency of three different sets of samples: replicas (with and without microstructures), copper wires (smooth and microgrooved), and polyolefin meshes (hydrophilic, superhydrophilic, and hydrophobic). The collection efficiency of the samples was compared within each set separately to investigate the influence of microstructures and/or the wettability of the surfaces on fog collection. Under the controlled experimental conditions chosen here, large differences in efficiency were found. Microstructured plant replica samples collected 2-3 times more water than unstructured (smooth) samples. Copper wire samples showed similar results; moreover, water droplets dripped faster from microgrooved wires than from smooth wires. The superhydrophilic mesh tested here proved more efficient than any of the other mesh samples with different wettability: the amount of fog it collected was about 5 times higher than that of the hydrophilic (untreated) mesh and about 2 times higher than that of the hydrophobic mesh.

  9. Evaluation of environmental sampling methods for detection of Salmonella enterica in a large animal veterinary hospital.

    PubMed

    Goeman, Valerie R; Tinkler, Stacy H; Hammac, G Kenitra; Ruple, Audrey

    2018-04-01

    Environmental surveillance for Salmonella enterica can be used for early detection of contamination; thus routine sampling is an integral component of infection control programs in hospital environments. At the Purdue University Veterinary Teaching Hospital (PUVTH), the technique regularly employed in the large animal hospital for sample collection uses sterile gauze sponges for environmental sampling, which has proven labor-intensive and time-consuming. Alternative sampling methods use Swiffer brand electrostatic wipes for environmental sample collection, which are reportedly effective and efficient. It was hypothesized that use of Swiffer wipes for sample collection would be more efficient and less costly than the use of gauze sponges. A head-to-head comparison between the 2 sampling methods was conducted in the PUVTH large animal hospital and relative agreement, cost-effectiveness, and sampling efficiency were compared. There was fair agreement in culture results between the 2 sampling methods, but Swiffer wipes required less time and less physical effort to collect samples and were more cost-effective.

  10. Tangential flow ultrafiltration for detection of white spot syndrome virus (WSSV) in shrimp pond water.

    PubMed

    Alavandi, S V; Ananda Bharathi, R; Satheesh Kumar, S; Dineshkumar, N; Saravanakumar, C; Joseph Sahaya Rajan, J

    2015-06-15

    Water represents the most important component in the white spot syndrome virus (WSSV) transmission pathway in aquaculture, yet very little information is available on it. Detection of viruses in water is a challenge, since their counts will often be too low to be detected by available methods such as polymerase chain reaction (PCR). To overcome this difficulty, viruses in water have to be concentrated from large volumes prior to detection. In this study, a total of 19 water samples from an aquaculture ecosystem, comprising 3 creeks, 10 shrimp culture ponds, 3 shrimp broodstock tanks, 2 larval rearing tanks of shrimp hatcheries, and a sample from a hatchery effluent treatment tank, were subjected to concentration of viruses by ultrafiltration (UF) using tangential flow filtration (TFF). Twenty to 100 L of water from these sources was concentrated to a final volume of 100 mL (200-1000 fold). The efficiency of recovery of WSSV by TFF ranged from 7.5 to 89.61%. WSSV could be successfully detected by PCR in the viral concentrates obtained from water samples of three shrimp culture ponds and one each of the shrimp broodstock tank, larval rearing tank, and shrimp hatchery effluent treatment tank, with WSSV copy numbers ranging from 6 to 157 mL(-1) by quantitative real-time PCR. The ultrafiltration virus concentration technique enables efficient detection of shrimp viral pathogens in water from aquaculture facilities. It could be used as an important tool to understand the efficacy of biosecurity protocols adopted in aquaculture facilities and to carry out epidemiological investigations of aquatic viral pathogens. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Quantification of online removal of refractory black carbon using laser-induced incandescence in the single particle soot photometer

    DOE PAGES

    Aiken, Allison C.; McMeeking, Gavin R.; Levin, Ezra J. T.; ...

    2016-04-05

    Refractory black carbon (rBC) is an aerosol that has important impacts on climate and human health. rBC is often mixed with other species, making it difficult to isolate and quantify its important effects on the physical and optical properties of ambient aerosol. To solve this measurement challenge, a new method to remove rBC was developed using laser-induced incandescence (LII) by Levin et al. in 2014. Application of the method with the Single Particle Soot Photometer (SP2) is used to determine the effects of rBC on ice nucleating particles (INP). Here, we quantify the efficacy of the method in the laboratory using the rBC surrogate Aquadag. Polydisperse and mobility-selected samples (100–500 nm diameter, 0.44–36.05 fg) are quantified by a second SP2. Removal rates are reported by mass and number. For the mobility-selected samples, the average percentages removed by mass and number of the original size are 88.9 ± 18.6% and 87.3 ± 21.9%, respectively. Removal of Aquadag is efficient for particles >100 nm mass-equivalent diameter (d_me), enabling application for microphysical studies. However, the removal of particles ≤100 nm d_me is less efficient. Absorption and scattering measurements are reported to assess its use to isolate brown carbon (BrC) absorption. Scattering removal rates for the mobility-selected samples are >90% on average, yet absorption removal rates are 53% on average across all wavelengths. Therefore, application to isolate effects of microphysical properties determined by larger sizes is promising, but will be challenging for optical properties. Lastly, the results reported also have implications for other instruments employing internal LII, e.g., the Soot Particle Aerosol Mass Spectrometer (SP-AMS).

  12. Evaluation of three traps for sampling Aedes polynesiensis and other mosquito species in American Samoa.

    PubMed

    Schmaedick, Mark A; Ball, Tamara S; Burkot, Thomas R; Gurr, Neil E

    2008-06-01

    The efficacy of the recently developed BG-Sentinel mosquito trap baited with BG-Lure (a combination of lactic acid, ammonia, and caproic acid) was evaluated in American Samoa against the omnidirectional Fay-Prince trap and the Centers for Disease Control and Prevention (CDC) light trap, both baited with carbon dioxide. The BG-Sentinel trap captured the greatest number of the important filariasis and dengue vector Aedes (Stegomyia) polynesiensis at all 3 collection locations; however, its catch rate was not significantly different from that of the Fay-Prince trap at 2 of the 3 trapping locations. The CDC light trap caught very few Ae. polynesiensis. The Fay-Prince trap was more efficient than the other 2 traps for collecting Aedes (Aedimorphus) nocturnus, Aedes (Finlaya) spp., Culex quinquefasciatus, and Culex annulirostris. The efficacy and convenience of the BG-Sentinel suggest further research is warranted to evaluate its potential as a possible efficient and safe alternative to landing catches for sampling Ae. polynesiensis in research and control efforts against filariasis and dengue in the South Pacific.

  13. Sieve efficiency in benthic sampling as related to chironomid head capsule width

    USGS Publications Warehouse

    Hudson, Patrick L.; Adams, Jean V.

    1998-01-01

    The width of the head capsule in chironomid larvae is the most important morphometric character controlling retention of specimens in sieving devices. Knowledge of the range in size of these widths within any chironomid community is fundamental to sampling and interpreting the resulting data. We present the head capsule widths of 30 species of chironomids and relate their size distribution to loss or retention in several experiments using graded sieve sizes. Based on our measurements and those found in the literature, the head capsule width of fourth instars in half the chironomid species is less than 350 µm. Many species may never be collected with the commonly used U.S. Standard No. 30 sieve (589 µm), and the No. 60 screen (246 µm) appears to retain most species only qualitatively. We found that 70 to 90% of the chironomid larvae, and 19 to 34% of their biomass, can pass through a No. 80 sieve (177 µm). The implications of sieve loss and other factors affecting sieving efficiency are discussed.

  14. Field evaluation of personal sampling methods for multiple bioaerosols.

    PubMed

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  15. Efficient Robust Regression via Two-Stage Generalized Empirical Likelihood

    PubMed Central

    Bondell, Howard D.; Stefanski, Leonard A.

    2013-01-01

    Large- and finite-sample efficiency and resistance to outliers are the key goals of robust statistics. Although often not simultaneously attainable, we develop and study a linear regression estimator that comes close. Efficiency obtains from the estimator’s close connection to generalized empirical likelihood, and its favorable robustness properties are obtained by constraining the associated sum of (weighted) squared residuals. We prove maximum attainable finite-sample replacement breakdown point, and full asymptotic efficiency for normal errors. Simulation evidence shows that compared to existing robust regression estimators, the new estimator has relatively high efficiency for small sample sizes, and comparable outlier resistance. The estimator is further illustrated and compared to existing methods via application to a real data set with purported outliers. PMID:23976805

  16. Sampling Methods for Detection and Monitoring of the Asian Citrus Psyllid (Hemiptera: Psyllidae).

    PubMed

    Monzo, C; Arevalo, H A; Jones, M M; Vanaclocha, P; Croxton, S D; Qureshi, J A; Stansly, P A

    2015-06-01

    The Asian citrus psyllid (ACP), Diaphorina citri Kuwayama, is a key pest of citrus due to its role as vector of citrus greening disease, or "huanglongbing." ACP monitoring is considered an indispensable tool for management of the vector and disease. In the present study, datasets collected between 2009 and 2013 from 245 citrus blocks were used to evaluate the precision, detection sensitivity, and efficiency of five sampling methods. The number of samples needed to reach a standard error-to-mean ratio of 0.25 was estimated using Taylor's power law and used to compare precision among sampling methods. Detection sensitivity and time expenditure (cost) of stem-tap sampling versus the other methodologies, conducted consecutively at the same locations, were also assessed. Stem-tap sampling was the most efficient sampling method when ACP densities were moderate to high and served as the basis for comparison with all other methods. Protocols that grouped trees near randomly selected locations across the block were more efficient than sampling trees at random across the block. Sweep net sampling was similar to stem-taps in number of captures per sampled unit, but less precise at any ACP density. Yellow sticky traps were 14 times more sensitive than stem-taps but much more time consuming and thus less efficient except at very low population densities. Visual sampling was efficient for detecting and monitoring ACP at low densities. Suction sampling was time consuming and taxing but the most sensitive of all methods for detection of sparse populations. This information can be used to optimize ACP monitoring efforts. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
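
    The precision criterion used here has a simple closed form: if Taylor's power law gives s² = a·m^b, then the sample size achieving a standard error-to-mean ratio D is n = a·m^(b-2)/D². A worked sketch with hypothetical coefficients rather than the paper's fitted values:

```python
# Sample size to reach SE/mean = D via Taylor's power law (a, b hypothetical).
def samples_needed(mean_density, a, b, target_ratio=0.25):
    variance = a * mean_density ** b                 # Taylor's power law
    # SE/mean = sqrt(variance / n) / mean  =>  n = variance / (ratio * mean)^2
    return variance / (target_ratio * mean_density) ** 2

for m in (0.1, 0.5, 2.0, 10.0):                      # mean ACP per sample unit
    print(f"mean {m:5.1f}: n = {samples_needed(m, a=2.0, b=1.4):8.1f}")
```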

  17. Comparative Efficiency of the Fenwick Can and Schuiling Centrifuge in Extracting Nematode Cysts from Different Soil Types

    PubMed Central

    Bellvert, Joaquim; Crombie, Kieran; Horgan, Finbarr G.

    2008-01-01

    The Fenwick can and Schuiling centrifuge are widely used to extract nematode cysts from soil samples. The comparative efficiencies of these two methods during cyst extraction have not been determined for different soil types under different cyst densities. Such information is vital for statutory laboratories that must choose a method for routine, high-throughput soil monitoring. In this study, samples of different soil types seeded with varying densities of potato cyst nematode (Globodera rostochiensis) cysts were processed using both methods. In one experiment, with 200 ml samples, recovery was similar between methods. In a second experiment with 500 ml samples, cyst recovery was higher using the Schuiling centrifuge. For each method and soil type, cyst extraction efficiency was similar across all densities tested. Extraction was efficient from pure sand (Fenwick 72%, Schuiling 84%) and naturally sandy soils (Fenwick 62%, Schuiling 73%), but was significantly less efficient from clay-soil (Fenwick 42%, Schuiling 44%) and peat-soil with high organic matter content (Fenwick 35%, Schuiling 33%). Residual moisture (<10% w/w) in samples prior to analyses reduced extraction efficiency, particularly for sand and sandy soils. For each soil type and method, there were significant linear relationships between the number of cysts extracted and the numbers of cysts in the samples. We discuss the advantages and disadvantages of each extraction method for cyst extraction in statutory soil laboratories. PMID:19259516

  18. Long Term Resource Monitoring Program procedures: fish monitoring

    USGS Publications Warehouse

    Ratcliff, Eric N.; Glittinger, Eric J.; O'Hara, T. Matt; Ickes, Brian S.

    2014-01-01

    This manual constitutes the second revision of the U.S. Army Corps of Engineers’ Upper Mississippi River Restoration-Environmental Management Program (UMRR-EMP) Long Term Resource Monitoring Program (LTRMP) element Fish Procedures Manual. The original (1988) manual merged and expanded on ideas and recommendations related to Upper Mississippi River fish sampling presented in several early documents. The first revision to the manual was made in 1995 reflecting important protocol changes, such as the adoption of a stratified random sampling design. The 1995 procedures manual has been an important document through the years and has been cited in many reports and scientific manuscripts. The resulting data collected by the LTRMP fish component represent the largest dataset on fish within the Upper Mississippi River System (UMRS) with more than 44,000 collections of approximately 5.7 million fish. The goal of this revision of the procedures manual is to document changes in LTRMP fish sampling procedures since 1995. Refinements to sampling methods become necessary as monitoring programs mature. Possible refinements are identified through field experiences (e.g., sampling techniques and safety protocols), data analysis (e.g., planned and studied gear efficiencies and reallocations of effort), and technological advances (e.g., electronic data entry). Other changes may be required because of financial necessity (i.e., unplanned effort reductions). This version of the LTRMP fish monitoring manual describes the most current (2014) procedures of the LTRMP fish component.

  19. Nanotechnology-Based Surface Plasmon Resonance Affinity Biosensors for In Vitro Diagnostics

    PubMed Central

    Antiochia, Riccarda; Bollella, Paolo; Favero, Gabriele

    2016-01-01

    In the last decades, in vitro diagnostic devices (IVDDs) have become a very important tool in medicine for an early and correct diagnosis, proper screening of targeted populations, and assessment of the efficiency of a specific therapy. In this review, the most recent developments regarding different configurations of surface plasmon resonance affinity biosensors modified by using several nanostructured materials for in vitro diagnostics are critically discussed. Both the assembly and the performance of the IVDDs tested in biological samples are reported and compared. PMID:27594884

  20. Developing and Evaluating the HRM Technique for Identifying Cytochrome P450 2D6 Polymorphisms.

    PubMed

    Lu, Hsiu-Chin; Chang, Ya-Sian; Chang, Chun-Chi; Lin, Ching-Hsiung; Chang, Jan-Gowth

    2015-05-01

    Cytochrome P450 2D6 is one of the important enzymes involved in the metabolism of many widely used drugs. Genetic polymorphisms of CYP2D6 can affect its activity. Therefore, an efficient method for identifying CYP2D6 polymorphisms is clinically important. We developed a high-resolution melting (HRM) analysis to investigate CYP2D6 polymorphisms. Genomic DNA was extracted from peripheral blood samples from 71 healthy individuals. All nine exons of the CYP2D6 gene were sequenced before screening by HRM analysis. This method can detect the most common genotypes (*1, *2, *4, *10, *14, *21, *39, and *41) of CYP2D6 in Chinese populations. All samples were successfully genotyped, and the four most common mutant CYP2D6 alleles (*1, *2, *10, and *41) can be genotyped. The single nucleotide polymorphism (SNP) frequencies of 100C > T (rs1065852), 1039C > T (rs1081003), 1661G > C (rs1058164), 2663G > A (rs28371722), 2850C > T (rs16947), 2988G > A (rs28371725), 3181A > G, and 4180G > C (rs1135840) were 58%, 61%, 73%, 1%, 13%, 3%, 1%, and 73%, respectively. We identified 100% of all heterozygotes without any errors. The two homozygous genotypes (1661G > C and 4180G > C) can be distinguished by mixing with a known genotype sample to generate an artificial heterozygote for HRM analysis. Therefore, all samples could be identified using our HRM method, and the results of HRM analysis are identical to those obtained by sequencing. Our method achieved 100% sensitivity, specificity, positive predictive value, and negative predictive value. HRM analysis is a nongel resolution method that is faster and less expensive than direct sequencing. Our study shows that it is an efficient tool for typing CYP2D6 polymorphisms. © 2014 Wiley Periodicals, Inc.

  1. An efficient Agrobacterium-mediated transformation method for aflatoxin generation fungus Aspergillus flavus.

    PubMed

    Han, Guomin; Shao, Qian; Li, Cuiping; Zhao, Kai; Jiang, Li; Fan, Jun; Jiang, Haiyang; Tao, Fang

    2018-05-01

    Aspergillus flavus often invades many important crops and produces harmful aflatoxins, both preharvest and during storage. The regulatory mechanism of aflatoxin biosynthesis in this fungus has not been well explored, mainly due to the lack of an efficient transformation method for constructing a genome-wide gene mutant library. This challenge was resolved in this study, where a reliable and efficient Agrobacterium tumefaciens-mediated transformation (ATMT) protocol for A. flavus NRRL 3357 was established. The results showed that removal of multinucleate conidia, to collect a homogenous sample of uninucleate conidia for use as the transformation material, is the key step in this procedure. A. tumefaciens strain AGL-1 harboring the ble gene for zeocin resistance under the control of the gpdA promoter from A. nidulans is suitable for genetic transformation of this fungus. We successfully generated A. flavus transformants with an efficiency of ~60 positive transformants per 10(6) conidia using our protocol. A small-scale insertional mutant library (~1,000 mutants) was constructed using this method, and several of the resulting mutants lacked both conidia production and aflatoxin biosynthesis capacity. Southern blotting analysis demonstrated that the majority of the transformants contained a single T-DNA insert in the genome. To the best of our knowledge, this is the first report of genetic transformation of A. flavus via ATMT, and our protocol provides an effective tool for construction of genome-wide gene mutant libraries for functional analysis of important genes in A. flavus.

  2. Sample size considerations for clinical research studies in nuclear cardiology.

    PubMed

    Chiuzan, Cody; West, Erin A; Duong, Jimmy; Cheung, Ken Y K; Einstein, Andrew J

    2015-12-01

    Sample size calculation is an important element of research design that investigators need to consider in the planning stage of the study. Funding agencies and research review panels request a power analysis, for example, to determine the minimum number of subjects needed for an experiment to be informative. Calculating the right sample size is crucial to gaining accurate information and ensures that research resources are used efficiently and ethically. The simple question "How many subjects do I need?" does not always have a simple answer. Before calculating the sample size requirements, a researcher must address several aspects, such as purpose of the research (descriptive or comparative), type of samples (one or more groups), and data being collected (continuous or categorical). In this article, we describe some of the most frequent methods for calculating the sample size with examples from nuclear cardiology research, including for t tests, analysis of variance (ANOVA), non-parametric tests, correlation, Chi-squared tests, and survival analysis. For the ease of implementation, several examples are also illustrated via user-friendly free statistical software.
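
    As a worked example of the simplest case mentioned, the normal-approximation sample size per group for a two-sample t test is n = 2(z_{1-α/2} + z_{1-β})²σ²/Δ². The numbers below are illustrative, not taken from the article:

```python
# Per-group sample size for a two-sample comparison of means (normal approx.).
from scipy.stats import norm

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return 2 * (z * sigma / delta) ** 2

# e.g., detect a 5-point difference in ejection fraction with sd of 10 points
print(f"n per group ~ {n_per_group(delta=5.0, sigma=10.0):.0f}")   # ~63
```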

  3. Monitoring nekton as a bioindicator in shallow estuarine habitats

    USGS Publications Warehouse

    Raposa, K.B.; Roman, C.T.; Heltshe, J.F.

    2003-01-01

    Long-term monitoring of estuarine nekton has many practical and ecological benefits but efforts are hampered by a lack of standardized sampling procedures. This study provides a rationale for monitoring nekton in shallow (< 1 m), temperate, estuarine habitats and addresses some important issues that arise when developing monitoring protocols. Sampling in seagrass and salt marsh habitats is emphasized due to the susceptibility of each habitat to anthropogenic stress and to the abundant and rich nekton assemblages that each habitat supports. Extensive sampling with quantitative enclosure traps that estimate nekton density is suggested. These gears have a high capture efficiency in most habitats and are small enough (e.g., 1 m(2)) to permit sampling in specific microhabitats. Other aspects of nekton monitoring are discussed, including spatial and temporal sampling considerations, station selection, sample size estimation, and data collection and analysis. Developing and initiating long-term nekton monitoring programs will help evaluate natural and human-induced changes in estuarine nekton over time and advance our understanding of the interactions between nekton and the dynamic estuarine environment.

  4. Integrated DNA and RNA extraction and purification on an automated microfluidic cassette from bacterial and viral pathogens causing community-acquired lower respiratory tract infections.

    PubMed

    Van Heirstraeten, Liesbet; Spang, Peter; Schwind, Carmen; Drese, Klaus S; Ritzi-Lehnert, Marion; Nieto, Benjamin; Camps, Marta; Landgraf, Bryan; Guasch, Francesc; Corbera, Antoni Homs; Samitier, Josep; Goossens, Herman; Malhotra-Kumar, Surbhi; Roeser, Tina

    2014-05-07

    In this paper, we describe the development of an automated sample preparation procedure for etiological agents of community-acquired lower respiratory tract infections (CA-LRTI). The consecutive assay steps, including sample re-suspension, pre-treatment, lysis, nucleic acid purification, and concentration, were integrated into a microfluidic lab-on-a-chip (LOC) cassette that is operated hands-free by a demonstrator setup, providing fluidic and valve actuation. The performance of the assay was evaluated on viral and Gram-positive and Gram-negative bacterial broth cultures previously sampled using a nasopharyngeal swab. Sample preparation on the microfluidic cassette resulted in higher or similar concentrations of pure bacterial DNA or viral RNA compared to manual benchtop experiments. The miniaturization and integration of the complete sample preparation procedure, to extract purified nucleic acids from real samples of CA-LRTI pathogens to, and above, lab quality and efficiency, represent important steps towards its application in a point-of-care test (POCT) for rapid diagnosis of CA-LRTI.

  5. Improving inference for aerial surveys of bears: The importance of assumptions and the cost of unnecessary complexity.

    PubMed

    Schmidt, Joshua H; Wilson, Tammy L; Thompson, William L; Reynolds, Joel H

    2017-07-01

    Obtaining useful estimates of wildlife abundance or density requires thoughtful attention to potential sources of bias and precision, and it is widely understood that addressing incomplete detection is critical to appropriate inference. When the underlying assumptions of sampling approaches are violated, both increased bias and reduced precision of the population estimator may result. Bear (Ursus spp.) populations can be difficult to sample and are often monitored using mark-recapture distance sampling (MRDS) methods, although obtaining adequate sample sizes can be cost prohibitive. With the goal of improving inference, we examined the underlying methodological assumptions and estimator efficiency of three datasets collected under an MRDS protocol designed specifically for bears. We analyzed these data using MRDS, conventional distance sampling (CDS), and open-distance sampling approaches to evaluate the apparent bias-precision tradeoff relative to the assumptions inherent under each approach. We also evaluated the incorporation of informative priors on detection parameters within a Bayesian context. We found that the CDS estimator had low apparent bias and was more efficient than the more complex MRDS estimator. When combined with informative priors on the detection process, precision was increased by >50% compared to the MRDS approach with little apparent bias. In addition, open-distance sampling models revealed a serious violation of the assumption that all bears were available to be sampled. Inference is directly related to the underlying assumptions of the survey design and the analytical tools employed. We show that for aerial surveys of bears, avoidance of unnecessary model complexity, use of prior information, and the application of open population models can be used to greatly improve estimator performance and simplify field protocols. Although we focused on distance sampling-based aerial surveys for bears, the general concepts we addressed apply to a variety of wildlife survey contexts.
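
    The core of the CDS estimator favored here is a one-parameter maximum likelihood fit of a half-normal detection function to the observed distances, followed by a correction for average detection probability. A self-contained sketch on simulated distances (not survey data):

```python
# Half-normal line-transect CDS sketch: simulate detections, fit sigma by ML,
# recover the average detection probability within the truncation distance w.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(4)
w = 400.0                                            # truncation distance (m)
x = rng.uniform(0, w, size=5000)                     # animals, uniform offsets
x = x[rng.random(x.size) < np.exp(-x**2 / (2 * 150.0**2))]   # detected subset

def neg_loglik(sigma):
    # f(x) = g(x) / mu, with g half-normal and mu = integral of g from 0 to w
    mu = np.sqrt(2 * np.pi) * sigma * (norm.cdf(w / sigma) - 0.5)
    return -(np.sum(-x**2 / (2 * sigma**2)) - x.size * np.log(mu))

sigma_hat = minimize_scalar(neg_loglik, bounds=(10, 1000), method="bounded").x
mu_hat = np.sqrt(2 * np.pi) * sigma_hat * (norm.cdf(w / sigma_hat) - 0.5)
print(f"sigma ~ {sigma_hat:.0f} m, mean detection probability ~ {mu_hat / w:.2f}")
```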

  6. Elemental, Isotopic, and Organic Analysis on Mars with Laser TOF-MS

    NASA Technical Reports Server (NTRS)

    Brinckerhoff, W. B.; Cornish, T. J.

    2000-01-01

    The in-depth landed exploration of Mars will require increasingly sophisticated robotic analytical tools for both in situ composition science [1] and reconnaissance for sample return [2]. Beyond dust, rock surfaces, and topsoil, samples must be accessed within rocks and ice, well below surface soil, and possibly in elevated deposit layers. A range of spatial scales will be studied, and for the most information-rich microscopic analyses, samples must be acquired, prepared, and positioned with high precision. In some cases samples must also be brought into a vacuum chamber. After expending such resources, it will be important to apply techniques that provide a wide range of information about the samples. Microscopy, mineralogy, and molecular/organic, elemental, and isotopic analyses are all needed, at a minimum, to begin to address the in situ goals at Mars. These techniques must work as an efficient suite to provide layers of data, each layer helping to determine if further analysis on a given sample is desired. In the spirit of broad-band and efficient data collection, we are developing miniature laser time-of-flight mass spectrometers (TOF-MS) for elemental, isotopic, and molecular/organic microanalysis of unprepared solid samples. Laser TOF-MS uses a pulsed laser to volatilize and ionize material from a small region on the sample. The laser energy and focus can be adjusted for atomic and molecular content, sampling area, and depth. Ions travel through the instrument and are detected at a sequence of times proportional to the square root of their mass-to- charge ratios. Thus, each laser pulse produces a complete mass spectrum (in less than 50 microseconds). These instruments can now be significantly miniaturized (potentially to the size of a soda can) without a loss in performance. This effort is reviewed here with an emphasis on applications to Mars exploration.
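
    The timing relation stated above follows from energy conservation: an ion of mass m and charge z accelerated through a potential U leaves with v = sqrt(2zeU/m), so its flight time over a drift length L is t = L·sqrt(m/(2zeU)), proportional to sqrt(m/z). A quick check with hypothetical instrument dimensions:

```python
# TOF flight times for a few ions (drift length and voltage are assumptions).
import math

e = 1.602176634e-19       # elementary charge (C)
amu = 1.66053906660e-27   # atomic mass unit (kg)
L, U = 0.1, 3000.0        # 10 cm drift tube, 3 kV acceleration (hypothetical)

def flight_time(mass_amu, charge=1):
    v = math.sqrt(2 * charge * e * U / (mass_amu * amu))
    return L / v

for m in (12, 56, 238):   # carbon, iron, uranium ions
    print(f"m/z {m:3d}: t = {flight_time(m) * 1e6:.2f} microseconds")
```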

  7. Cryptosporidium Oocyst Detection in Water Samples: Floatation Technique Enhanced with Immunofluorescence Is as Effective as Immunomagnetic Separation Method

    PubMed Central

    Koompapong, Khuanchai; Sutthikornchai, Chantira

    2009-01-01

    Cryptosporidium can cause gastrointestinal diseases worldwide, consequently posing public health problems and economic burden. Effective techniques for detecting contaminated oocysts in water are important to prevent and control the contamination. The immunomagnetic separation (IMS) method has been widely employed recently due to its efficiency, but it is costly. The sucrose floatation technique is generally used for separating organisms by their different specific gravity. It is effective and cheap but time consuming, as well as requiring highly skilled personnel. Water turbidity and parasite load in the water sample are additional factors affecting the recovery rate of those 2 methods. We compared the efficiency of the IMS and sucrose floatation methods to recover spiked Cryptosporidium oocysts in water samples of various turbidities. Cryptosporidium oocysts at concentrations of 1, 10(1), 10(2), and 10(3) per 10 µl were spiked into 3 sets of 10 ml-water turbidity (5, 50, and 500 NTU). The recovery rate of the 2 methods was not different. Oocyst load at a concentration < 10(2) per 10 ml yielded unreliable results. Water turbidity at 500 NTU decreased the recovery rate of both techniques. The combination of sucrose floatation and immunofluorescence assay techniques (SF-FA) showed a higher recovery rate than IMS and immunofluorescence assay (IMS-FA). We used this SF-FA to detect Cryptosporidium and Giardia in river water samples and found 9 and 19 out of 30 (30% and 63.3%) positive, respectively. Our results favored the sucrose floatation technique enhanced with immunofluorescence assay for detecting contaminated protozoa in water samples in general laboratories and in real practical settings. PMID:19967082

  8. Cryptosporidium oocyst detection in water samples: floatation technique enhanced with immunofluorescence is as effective as immunomagnetic separation method.

    PubMed

    Koompapong, Khuanchai; Sutthikornchai, Chantira; Sukthana, Yowalark

    2009-12-01

    Cryptosporidium can cause gastrointestinal diseases worldwide, consequently posing public health problems and economic burden. Effective techniques for detecting contaminated oocysts in water are important to prevent and control the contamination. The immunomagnetic separation (IMS) method has been widely employed recently due to its efficiency, but it is costly. The sucrose floatation technique is generally used for separating organisms by their different specific gravity. It is effective and cheap but time consuming, as well as requiring highly skilled personnel. Water turbidity and parasite load in the water sample are additional factors affecting the recovery rate of those 2 methods. We compared the efficiency of the IMS and sucrose floatation methods to recover spiked Cryptosporidium oocysts in water samples of various turbidities. Cryptosporidium oocysts at concentrations of 1, 10(1), 10(2), and 10(3) per 10 microl were spiked into 3 sets of 10 ml-water turbidity (5, 50, and 500 NTU). The recovery rate of the 2 methods was not different. Oocyst load at a concentration < 10(2) per 10 ml yielded unreliable results. Water turbidity at 500 NTU decreased the recovery rate of both techniques. The combination of sucrose floatation and immunofluorescence assay techniques (SF-FA) showed a higher recovery rate than IMS and immunofluorescence assay (IMS-FA). We used this SF-FA to detect Cryptosporidium and Giardia in river water samples and found 9 and 19 out of 30 (30% and 63.3%) positive, respectively. Our results favored the sucrose floatation technique enhanced with immunofluorescence assay for detecting contaminated protozoa in water samples in general laboratories and in real practical settings.

  9. Sequential recovery of macromolecular components of the nucleolus.

    PubMed

    Bai, Baoyan; Laiho, Marikki

    2015-01-01

    The nucleolus is involved in a number of cellular processes of importance to cell physiology and pathology, including cell stress responses and malignancies. Studies of macromolecular composition of the nucleolus depend critically on the efficient extraction and accurate quantification of all macromolecular components (e.g., DNA, RNA, and protein). We have developed a TRIzol-based method that efficiently and simultaneously isolates these three macromolecular constituents from the same sample of purified nucleoli. The recovered and solubilized protein can be accurately quantified by the bicinchoninic acid assay and assessed by polyacrylamide gel electrophoresis or by mass spectrometry. We have successfully applied this approach to extract and quantify the responses of all three macromolecular components in nucleoli after drug treatments of HeLa cells, and conducted RNA-Seq analysis of the nucleolar RNA.

  10. Pairing call-response surveys and distance sampling for a mammalian carnivore

    USGS Publications Warehouse

    Hansen, Sara J. K.; Frair, Jacqueline L.; Underwood, Harold B.; Gibbs, James P.

    2015-01-01

    Density estimates accounting for differential animal detectability are difficult to acquire for wide-ranging and elusive species such as mammalian carnivores. Pairing distance sampling with call-response surveys may provide an efficient means of tracking changes in populations of coyotes (Canis latrans), a species of particular interest in the eastern United States. Blind field trials in rural New York State indicated 119-m linear error for triangulated coyote calls, and a 1.8-km distance threshold for call detectability, which was sufficient to estimate a detection function with precision using distance sampling. We conducted statewide road-based surveys with sampling locations spaced ≥6 km apart from June to August 2010. Each detected call (be it a single or group) counted as a single object, representing 1 territorial pair, because of uncertainty in the number of vocalizing animals. From 524 survey points and 75 detections, we estimated the probability of detecting a calling coyote to be 0.17 ± 0.02 SE, yielding a detection-corrected index of 0.75 pairs/10 km2 (95% CI: 0.52–1.1, 18.5% CV) for a minimum of 8,133 pairs across rural New York State. Importantly, we consider this an index rather than true estimate of abundance given the unknown probability of coyote availability for detection during our surveys. Even so, pairing distance sampling with call-response surveys provided a novel, efficient, and noninvasive means of monitoring populations of wide-ranging and elusive, albeit reliably vocal, mammalian carnivores. Our approach offers an effective new means of tracking species like coyotes, one that is readily extendable to other species and geographic extents, provided key assumptions of distance sampling are met.
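
    As a consistency check on the reported index, the standard point-transect estimator D = n / (kπw²p) can be evaluated with the numbers given in the abstract; the study's fitted detection function differs from this flat correction, so exact agreement is not expected:

```python
# Back-of-envelope point-transect density from the abstract's own numbers.
import math

n, k = 75, 524          # detections, survey points
w, p = 1.8, 0.17        # truncation distance (km), detection probability

density = n / (k * math.pi * w**2 * p)            # pairs per km^2
print(f"~{density * 10:.2f} pairs per 10 km^2")   # ~0.83, near the reported 0.75
```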

  11. Design and Field Procedures in the US National Comorbidity Survey Replication Adolescent Supplement (NCS-A)

    PubMed Central

    Kessler, Ronald C.; Avenevoli, Shelli; Costello, E. Jane; Green, Jennifer Greif; Gruber, Michael J.; Heeringa, Steven; Merikangas, Kathleen R.; Pennell, Beth-Ellen; Sampson, Nancy A.; Zaslavsky, Alan M.

    2009-01-01

    An overview is presented of the design and field procedures of the US National Comorbidity Survey Replication Adolescent Supplement (NCS-A), a US face-to-face household survey of the prevalence and correlates of DSM-IV mental disorders. The survey was based on a dual-frame design that included 904 adolescent residents of the households that participated in the US National Comorbidity Survey Replication (85.9% response rate) and 9,244 adolescent students selected from a nationally representative sample of 320 schools (74.7% response rate). After setting out the logic of dual-frame designs, comparisons are presented of sample and population distributions on Census socio-demographic variables and, in the school sample, school characteristics. These document only minor differences between the samples and the population. The results of a statistical analysis of the bias-efficiency trade-off in weight trimming are then presented. These show that modest trimming meaningfully reduces mean squared error. Analysis of comparative sample efficiency shows that the household sample is more efficient than the school sample, leading to the household sample receiving a higher weight relative to its size in the consolidated sample than the school sample. Taken together, these results show that the NCS-A is an efficient sample of the target population with good representativeness on a range of socio-demographic and geographic variables. PMID:19507169
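
    A small simulation sketch of the bias-efficiency trade-off analyzed here: capping the largest design weights introduces bias but reduces variance, and the mean squared error of a weighted mean can be compared across caps. The population and selection model below are synthetic, not NCS-A data.

```python
# Weight-trimming trade-off on a synthetic population with Poisson sampling.
import numpy as np

rng = np.random.default_rng(5)
Npop = 100_000
p = rng.uniform(0.001, 0.02, Npop)                   # selection probabilities
y = 50 + 800 * p + rng.normal(0, 5, Npop)            # outcome tied to selection
true_mean = y.mean()

def estimate(cap_quantile):
    take = rng.random(Npop) < p                      # draw one Poisson sample
    w = 1.0 / p[take]                                # design weights
    if cap_quantile < 1.0:
        w = np.minimum(w, np.quantile(w, cap_quantile))   # trim the weight tail
    return np.sum(w * y[take]) / np.sum(w)           # weighted (Hajek) mean

for cap in (1.00, 0.99, 0.95, 0.80):
    est = np.array([estimate(cap) for _ in range(300)])
    print(f"trim at q={cap:.2f}: bias={est.mean() - true_mean:+.3f}, "
          f"MSE={np.mean((est - true_mean) ** 2):.4f}")
```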

  12. Urban Land Cover Mapping Accuracy Assessment - A Cost-benefit Analysis Approach

    NASA Astrophysics Data System (ADS)

    Xiao, T.

    2012-12-01

    One of the most important components in urban land cover mapping is mapping accuracy assessment. Many statistical models have been developed to help design sampling schemes based on both accuracy and confidence levels. It is intuitive that an increased number of samples increases the accuracy as well as the cost of an assessment. Understanding cost and sample size is crucial in implementing efficient and effective field data collection. Few studies have included a cost calculation component as part of the assessment. In this study, a cost-benefit sampling analysis model was created by combining sample size design and sampling cost calculation. The sampling cost included transportation cost, field data collection cost, and laboratory data analysis cost. Simple Random Sampling (SRS) and Modified Systematic Sampling (MSS) methods were used to design sample locations and to extract land cover data in ArcGIS. High-resolution land cover data layers of Denver, CO and Sacramento, CA, street networks, and parcel GIS data layers were used in this study to test and verify the model. The relationship between cost and accuracy was used to determine the effectiveness of each sampling method. The results of this study can be applied to other environmental studies that require spatial sampling.
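
    A minimal sketch of the cost-benefit pairing, assuming a binomial sample-size formula for overall map accuracy and hypothetical unit costs; the study's cost model (transportation, field collection, and laboratory analysis under SRS and MSS designs) is richer than this:

```python
# Sample size vs. assessment cost for a target accuracy margin (unit costs
# and the expected accuracy p are hypothetical).
import math

def sample_size(p=0.85, margin=0.05, z=1.96):
    """Points needed to estimate overall accuracy p within +/- margin (95%)."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

def total_cost(n, transport=12.0, field=8.0, lab=5.0):
    return n * (transport + field + lab)             # USD per sample point

for margin in (0.10, 0.05, 0.02):
    n = sample_size(margin=margin)
    print(f"margin {margin:.2f}: n = {n:4d}, cost ~ ${total_cost(n):,.0f}")
```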

  13. An Agile Functional Analysis of Metagenomic Data Using SUPER-FOCUS.

    PubMed

    Silva, Genivaldo Gueiros Z; Lopes, Fabyano A C; Edwards, Robert A

    2017-01-01

    One of the main goals in metagenomics is to identify the functional profile of a microbial community from unannotated shotgun sequencing reads. Functional annotation is important in biological research because it enables researchers to identify the abundance of functional genes of the organisms present in the sample, answering the question, "What can the organisms in the sample do?" Most currently available approaches do not scale with increasing data volumes, which is important because both the number and lengths of the reads provided by sequencing platforms keep increasing. Here, we present SUPER-FOCUS, SUbsystems Profile by databasE Reduction using FOCUS, an agile homology-based approach using a reduced reference database to report the subsystems present in metagenomic datasets and profile their abundances. SUPER-FOCUS was tested with real metagenomes, and the results show that it accurately predicts the subsystems present in the profiled microbial communities, is computationally efficient, and up to 1000 times faster than other tools. SUPER-FOCUS is freely available at http://edwards.sdsu.edu/SUPERFOCUS .

  14. Integrated solutions for urban runoff pollution control in Brazilian metropolitan regions.

    PubMed

    Morihama, A C D; Amaro, C; Tominaga, E N S; Yazaki, L F O L; Pereira, M C S; Porto, M F A; Mukai, P; Lucci, R M

    2012-01-01

    One of the most important causes of poor water quality in urban rivers in Brazil is the low collection efficiency of the sewer system due to unforeseen interconnections with the stormwater drainage system. Since the beginning of the 20th century, Brazilian cities have adopted separate systems for sanitary sewers and stormwater runoff. Gradually these two systems became interconnected. A major challenge faced today by water managers in Brazil is to find efficient and low-cost solutions to deal with this mixed system. The current situation poses an important threat to the improvement of water quality in urban rivers and lakes. This article presents an evaluation of the water quality parameters and the diffuse pollution loads during rain events in the Pinheiros River, a tributary of the Tietê River in São Paulo. It also presents different types of integrated solutions for reducing the pollution impact of combined systems, based on the European experience in urban water management. An evaluation of their performance and a comparison with the separate system used in most Brazilian cities are also presented. The study is based on an extensive water quality monitoring program that was developed for a special investigation of the Pinheiros River and lasted 2.5 years. Samples were collected on a daily basis, and water quality variables were analyzed on a daily, weekly or monthly basis. Two hundred water quality variables were monitored at 53 sampling points. During rain events, additional monitoring was carried out using an automated sampler. The Pinheiros River is one of the most important rivers in the São Paulo Metropolitan Region, and it is also a heavily polluted one.

  15. Self-assembled three-dimensional reduced graphene oxide-based hydrogel for highly efficient and facile removal of pharmaceutical compounds from aqueous solution.

    PubMed

    Umbreen, Nadia; Sohni, Saima; Ahmad, Imtiaz; Khattak, Nimat Ullah; Gul, Kashif

    2018-05-14

    Herein, self-assembled three-dimensional reduced graphene oxide (RGO)-based hydrogels were synthesized and characterized in detail. A thorough investigation of the uptake of three widely used pharmaceutical drugs, viz. Naproxen (NPX), Ibuprofen (IBP) and Diclofenac (DFC), from aqueous solutions was carried out. To ensure the sustainability of the developed hydrogel assembly, practically important parameters such as desorption, recyclability and applicability to real samples were also evaluated. Using the developed 3D hydrogels as adsorptive platforms, excellent decontamination of the above-mentioned persistent pharmaceutical drugs was achieved at acidic pH, with removal efficiencies in the range of 70-80%. These hydrogels showed fast adsorption kinetics, and the experimental findings were fitted to different kinetic models, such as the pseudo-first-order, pseudo-second-order, intra-particle and Elovich models, in an attempt to better understand the adsorption kinetics. Furthermore, the equilibrium adsorption data were fitted to the Langmuir and Freundlich models, where the relatively higher R² values obtained in the case of the former suggested that monolayer adsorption played an important part in drug uptake. Thermodynamic aspects were also studied, and the negative ΔG⁰ values obtained indicated the spontaneous nature of the adsorption process. The study was also extended to assess the practical utility of the as-prepared hydrogels by spiking real aqueous samples with drug solution, where the high recoveries obtained for NPX, IBP and DFC are of particular importance with regard to prospective application in wastewater treatment systems. We advocate RGO-based hydrogels as an environmentally benign, readily recoverable/recyclable material with excellent adsorption capacity for application in wastewater purification.
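
    For readers who want to reproduce this kind of isotherm comparison, the sketch below fits the Langmuir and Freundlich models to equilibrium data with scipy and compares R² values; the adsorption data here are made up for illustration, not taken from the study.

      import numpy as np
      from scipy.optimize import curve_fit

      # Ce: equilibrium concentration (mg/L), qe: amount adsorbed (mg/g).
      def langmuir(Ce, qmax, KL):
          return qmax * KL * Ce / (1 + KL * Ce)

      def freundlich(Ce, KF, n):
          return KF * Ce ** (1 / n)

      Ce = np.array([1, 2, 5, 10, 20, 40], float)
      qe = np.array([12, 20, 33, 42, 48, 51], float)   # invented adsorption data

      for model, p0 in ((langmuir, (60, 0.1)), (freundlich, (10, 2))):
          popt, _ = curve_fit(model, Ce, qe, p0=p0)
          resid = qe - model(Ce, *popt)
          r2 = 1 - resid @ resid / ((qe - qe.mean()) @ (qe - qe.mean()))
          print(model.__name__, popt.round(3), round(r2, 4))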

  16. Extraction efficiency and implications for absolute quantitation of propranolol in mouse brain, liver and kidney thin tissue sections using droplet-based liquid microjunction surface sampling-HPLC ESI-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa

    Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. Absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol from 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% as liver thickness increased from 8 μm to 24 μm. Randomly selecting half of the samples as standards, the precision and accuracy of propranolol concentrations obtained for the other half of the samples as quality control metrics were determined. The resulting precision (±15%) and accuracy (±3%) values were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once the extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. Once the extraction efficiency has been calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor-intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.

  17. Extraction efficiency and implications for absolute quantitation of propranolol in mouse brain, liver and kidney thin tissue sections using droplet-based liquid microjunction surface sampling-HPLC ESI-MS/MS

    DOE PAGES

    Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa; ...

    2016-06-22

    Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. Absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol from 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% as liver thickness increased from 8 μm to 24 μm. Randomly selecting half of the samples as standards, the precision and accuracy of propranolol concentrations obtained for the other half of the samples as quality control metrics were determined. The resulting precision (±15%) and accuracy (±3%) values were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once the extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. Once the extraction efficiency has been calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor-intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.

  18. Technical efficiency and productivity of Chinese county hospitals: an exploratory study in Henan province, China.

    PubMed

    Cheng, Zhaohui; Tao, Hongbing; Cai, Miao; Lin, Haifeng; Lin, Xiaojun; Shu, Qin; Zhang, Ru-Ning

    2015-09-09

    Chinese county hospitals have been excessively enlarging their scale during the healthcare reform since 2009. The purpose of this paper is to examine the technical efficiency and productivity of county hospitals during the reform process, and to determine whether, and how, efficiency is affected by various factors. 114 sample county hospitals were selected from Henan province, China, from 2010 to 2012. Data envelopment analysis was employed to estimate the technical and scale efficiency of the sample hospitals. The Malmquist index was used to calculate productivity changes over time. Tobit regression was used to regress against 4 environmental factors and 5 institutional factors that affected technical efficiency. (1) 112 (98.2%), 112 (98.2%) and 104 (91.2%) of the 114 sample hospitals ran inefficiently in 2010, 2011 and 2012, with average technical efficiency of 0.697, 0.748 and 0.790, respectively. (2) On average, during 2010-2012, the productivity of the sample county hospitals increased by 7.8%, produced by progress in technical efficiency and technological change of 0.9% and 6.8%, respectively. (3) Tobit regression analysis indicated that government subsidy, hospital size above 618 beds and average length of stay were negatively associated with technical efficiency; bed occupancy rate, ratio of beds to nurses and ratio of nurses to physicians were positively associated with technical efficiency. There was considerable room for technical efficiency improvement in Henan county hospitals. During 2010-2012, the sample hospitals experienced productivity progress; however, the adverse change in pure technical efficiency should be emphasised. Moreover, according to the Tobit results, policy interventions that strictly supervise hospital bed scale, shorten the average length of stay and coordinate the proportions among physicians, nurses and beds would benefit hospital efficiency.
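
    The core efficiency computation can be sketched in a few lines: the input-oriented, constant-returns-to-scale (CCR) DEA score of one hospital is the optimum of a small linear program. The sketch below uses invented input/output data and is a minimal illustration, not the authors' full model (which also covers scale efficiency, Malmquist indices and Tobit regression).

      import numpy as np
      from scipy.optimize import linprog

      # Input-oriented CCR DEA score for unit o: minimize theta subject to
      # sum_j lambda_j x_j <= theta * x_o, sum_j lambda_j y_j >= y_o, lambda >= 0.
      def dea_efficiency(X, Y, o):
          n, m = X.shape                              # n units, m inputs
          s = Y.shape[1]                              # s outputs
          c = np.r_[1.0, np.zeros(n)]                 # variables: [theta, lambda]
          A_in = np.c_[-X[o][:, None], X.T]           # X^T lam - theta x_o <= 0
          A_out = np.c_[np.zeros((s, 1)), -Y.T]       # -Y^T lam <= -y_o
          res = linprog(c, A_ub=np.r_[A_in, A_out],
                        b_ub=np.r_[np.zeros(m), -Y[o]],
                        bounds=[(0, None)] * (n + 1))
          return res.fun                              # theta in (0, 1]; 1 = efficient

      rng = np.random.default_rng(0)
      X = rng.uniform(1, 10, (20, 2))                 # invented inputs, e.g. beds, staff
      Y = X.sum(axis=1, keepdims=True) * rng.uniform(0.5, 1.0, (20, 1))  # one output
      print([round(dea_efficiency(X, Y, o), 3) for o in range(5)])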

  19. Technical efficiency and productivity of Chinese county hospitals: an exploratory study in Henan province, China

    PubMed Central

    Cheng, Zhaohui; Tao, Hongbing; Cai, Miao; Lin, Haifeng; Lin, Xiaojun; Shu, Qin; Zhang, Ru-ning

    2015-01-01

    Objectives: Chinese county hospitals have been excessively enlarging their scale during the healthcare reform since 2009. The purpose of this paper is to examine the technical efficiency and productivity of county hospitals during the reform process, and to determine whether, and how, efficiency is affected by various factors. Setting and participants: 114 sample county hospitals were selected from Henan province, China, from 2010 to 2012. Outcome measures: Data envelopment analysis was employed to estimate the technical and scale efficiency of the sample hospitals. The Malmquist index was used to calculate productivity changes over time. Tobit regression was used to regress against 4 environmental factors and 5 institutional factors that affected technical efficiency. Results: (1) 112 (98.2%), 112 (98.2%) and 104 (91.2%) of the 114 sample hospitals ran inefficiently in 2010, 2011 and 2012, with average technical efficiency of 0.697, 0.748 and 0.790, respectively. (2) On average, during 2010–2012, the productivity of the sample county hospitals increased by 7.8%, produced by progress in technical efficiency and technological change of 0.9% and 6.8%, respectively. (3) Tobit regression analysis indicated that government subsidy, hospital size above 618 beds and average length of stay were negatively associated with technical efficiency; bed occupancy rate, ratio of beds to nurses and ratio of nurses to physicians were positively associated with technical efficiency. Conclusions: There was considerable room for technical efficiency improvement in Henan county hospitals. During 2010–2012, the sample hospitals experienced productivity progress; however, the adverse change in pure technical efficiency should be emphasised. Moreover, according to the Tobit results, policy interventions that strictly supervise hospital bed scale, shorten the average length of stay and coordinate the proportions among physicians, nurses and beds would benefit hospital efficiency. PMID:26353864

  20. Improving and Evaluating Nested Sampling Algorithm for Marginal Likelihood Estimation

    NASA Astrophysics Data System (ADS)

    Ye, M.; Zeng, X.; Wu, J.; Wang, D.; Liu, J.

    2016-12-01

    With the growing impacts of climate change and human activities on the water cycle, a growing body of research focuses on quantifying modeling uncertainty. Bayesian model averaging (BMA) provides a popular framework for quantifying conceptual model and parameter uncertainty. The ensemble prediction is generated by combining each plausible model's prediction, and each model is assigned a weight determined by its prior weight and marginal likelihood. Thus, the estimation of a model's marginal likelihood is crucial for reliable and accurate BMA prediction. The nested sampling estimator (NSE) is a newly proposed method for marginal likelihood estimation. NSE searches the parameter space gradually from low-likelihood to high-likelihood regions, and this evolution is carried out iteratively via a local sampling procedure. The efficiency of NSE is therefore dominated by the strength of the local sampling procedure. Currently, the Metropolis-Hastings (M-H) algorithm is often used for local sampling. However, M-H is not an efficient sampling algorithm for high-dimensional or complicated parameter spaces. To improve the efficiency of NSE, it is natural to incorporate a robust and efficient sampling algorithm, DREAM(ZS), into the local sampling step of NSE. The comparison results demonstrate that the improved NSE raises the efficiency of marginal likelihood estimation significantly. However, both the improved and the original NSE suffer from instability. In addition, the heavy computational cost of the large number of model executions is overcome by using adaptive sparse grid surrogates.
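
    A minimal nested sampling sketch for a one-dimensional toy model is given below; a simple rejection step stands in for the local Metropolis-Hastings or DREAM(ZS) move discussed above, and all model choices are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      # Toy model: uniform prior on [-5, 5], standard-normal likelihood.
      def loglike(theta):
          return -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)

      def nested_sampling(n_live=100, n_iter=600):
          live = rng.uniform(-5, 5, n_live)
          logL = loglike(live)
          logZ, logX = -np.inf, 0.0
          for i in range(n_iter):
              worst = np.argmin(logL)
              logX_new = -(i + 1) / n_live            # expected log prior volume
              w = np.exp(logX) - np.exp(logX_new)     # shell width in prior volume
              logZ = np.logaddexp(logZ, logL[worst] + np.log(w))
              # Replace the worst point with a prior draw above the likelihood
              # threshold; this rejection step stands in for the local M-H or
              # DREAM(ZS) move discussed in the abstract.
              while True:
                  cand = rng.uniform(-5, 5)
                  if loglike(cand) > logL[worst]:
                      break
              live[worst], logL[worst] = cand, loglike(cand)
              logX = logX_new
          # contribution of the remaining live points
          logZ = np.logaddexp(logZ, np.logaddexp.reduce(logL) + logX - np.log(n_live))
          return logZ

      # True marginal likelihood: integral of likelihood x uniform prior ~ 1/10
      print(np.exp(nested_sampling()))   # roughly 0.1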

  1. Deposition and immersion-mode nucleation of ice by three distinct samples of volcanic ash

    NASA Astrophysics Data System (ADS)

    Schill, G. P.; Genareau, K.; Tolbert, M. A.

    2015-07-01

    Ice nucleation of volcanic ash controls both ash aggregation and cloud glaciation, which affect atmospheric transport and global climate. Previously, it has been suggested that there is one characteristic ice nucleation efficiency for all volcanic ash, regardless of its composition, when accounting for surface area; however, this claim is derived from data from only two volcanic eruptions. In this work, we have studied the depositional and immersion freezing efficiency of three distinct samples of volcanic ash using Raman microscopy coupled to an environmental cell. Ash from the Fuego (basaltic ash, Guatemala), Soufrière Hills (andesitic ash, Montserrat), and Taupo (Oruanui eruption, rhyolitic ash, New Zealand) volcanoes were chosen to represent different geographical locations and silica content. All ash samples were quantitatively analyzed for both percent crystallinity and mineralogy using X-ray diffraction. In the present study, we find that all three samples of volcanic ash are excellent depositional ice nuclei, nucleating ice from 225 to 235 K at ice saturation ratios of 1.05 ± 0.01, comparable to the mineral dust proxy kaolinite. Since depositional ice nucleation will be more important at colder temperatures, fine volcanic ash may represent a global source of cold-cloud ice nuclei. For immersion freezing relevant to mixed-phase clouds, however, only the Oruanui ash exhibited appreciable heterogeneous ice nucleation activity. Similar to recent studies on mineral dust, we suggest that the mineralogy of volcanic ash may dictate its ice nucleation activity in the immersion mode.

  2. Efficient alignment-free DNA barcode analytics.

    PubMed

    Kuksa, Pavel; Pavlovic, Vladimir

    2009-11-10

    In this work we consider barcode DNA analysis problems and address them using alternative, alignment-free methods and representations which model sequences as collections of short sequence fragments (features). The methods use fixed-length representations (spectra) of barcode sequences to measure similarities or dissimilarities between sequences coming from the same or different species. The spectrum-based representation not only allows for accurate and computationally efficient species classification, but also opens the possibility of accurate clustering analysis of putative species barcodes and identification of critical within-barcode loci distinguishing barcodes of different sample groups. The new alignment-free methods provide highly accurate and fast DNA barcode-based identification and classification of species, with substantial improvements in accuracy and speed over state-of-the-art barcode analysis methods. We evaluate our methods on problems of species classification and identification using barcodes, important and relevant analytical tasks in many practical applications (adverse species movement monitoring, sampling surveys for unknown or pathogenic species identification, biodiversity assessment, etc.). On several benchmark barcode datasets, including ACG, Astraptes, Hesperiidae, Fish larvae, and Birds of North America, the proposed alignment-free methods considerably improve prediction accuracy compared to prior results. We also observe significant running-time improvements over the state-of-the-art methods. Our results show that the newly developed alignment-free methods for DNA barcoding can efficiently, and with high accuracy, identify specimens by examining only a few barcode features, resulting in increased scalability and interpretability of current computational approaches to barcoding.
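
    The spectrum representation itself is simple to state in code: map each barcode to a fixed-length vector of k-mer counts and compare vectors directly, with no alignment. The sketch below is a generic illustration of this idea, not the authors' implementation.

      from collections import Counter
      from itertools import product

      def spectrum(seq, k=3, alphabet="ACGT"):
          """Fixed-length k-mer spectrum: counts of every length-k word.
          Two barcodes can then be compared with any vector distance."""
          counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
          return [counts[''.join(w)] for w in product(alphabet, repeat=k)]

      a = spectrum("ACGTACGTTG")
      b = spectrum("ACGTTCGTAG")
      # squared Euclidean distance between spectra as a simple dissimilarity
      print(sum((x - y) ** 2 for x, y in zip(a, b)))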

  3. Continuous separation of breast cancer cells from blood samples using multi-orifice flow fractionation (MOFF) and dielectrophoresis (DEP).

    PubMed

    Moon, Hui-Sung; Kwon, Kiho; Kim, Seung-Il; Han, Hyunju; Sohn, Joohyuk; Lee, Soohyeon; Jung, Hyo-Il

    2011-03-21

    Circulating tumor cells (CTCs) are highly correlated with the invasive behavior of cancer, so their isolation and quantification are important for biomedical applications such as cancer prognosis and measuring responses to drug treatments. In this paper, we present the development of a microfluidic device for the separation of CTCs from blood cells based on the physical properties of cells. As a CTC model, we successfully separated human breast cancer cells (MCF-7) from a spiked blood cell sample by combining multi-orifice flow fractionation (MOFF) and dielectrophoretic (DEP) cell separation techniques. Hydrodynamic separation takes advantage of massive, high-throughput filtration of blood cells, as it can accommodate a very high flow rate. DEP separation acts as precise post-processing to enhance the efficiency of the separation. The serial combination of these two different sorting techniques enabled high-speed, continuous flow-through separation without labeling. We observed up to a 162-fold enrichment of MCF-7 cells at a 126 µL min⁻¹ flow rate. Red and white blood cells were efficiently removed, with separation efficiencies of 99.24% and 94.23%, respectively. Therefore, we suggest that our system could be used for the separation and detection of CTCs from blood cells for biomedical applications.

  4. Long-term Calibration Considerations during Subcutaneous Microdialysis Sampling in Mobile Rats

    PubMed Central

    Mou, Xiaodun; Lennartz, Michelle; Loegering, Daniel J.; Stenken, Julie A.

    2010-01-01

    The level at which implanted sensors and sampling devices maintain their calibration is an important research area. In this work, microdialysis probes with identical geometry and different membranes, polycarbonate/polyether (PC) or polyethersulfone (PES), were used with internal standards (vitamin B12 (MW 1355), antipyrine (MW 188) and 2-deoxyglucose (2-DG, MW 164)) and endogenous glucose to investigate changes in their long-term calibration after implantation into the subcutaneous space of Sprague-Dawley rats. Histological analysis confirmed an inflammatory response to the microdialysis probes and the presence of a collagen capsule. The membrane extraction efficiency (percentage delivered to the tissue space) for antipyrine and 2-DG was not altered throughout the implant lifetime for either PC- or PES-membrane probes. Yet, vitamin B12 extraction efficiency and collected glucose concentrations decreased during the implant lifetime. Antipyrine was administered i.v., and its concentrations obtained in both PC- and PES-membrane probes were significantly reduced between the implant day and seven (PC) or 10 (PES) days post implantation, suggesting that solute supply is critical for in vivo extraction efficiency. For low-molecular-weight solutes such as antipyrine and glucose, localized delivery is not affected by the foreign body reaction, but recovery is significantly reduced. For vitamin B12, a larger solute, the fibrotic capsule formed around the probe significantly restricts diffusion from the implanted microdialysis probes. PMID:20223515

  5. The influence of seine capture efficiency on fish abundance estimates in the upper Mississippi River

    USGS Publications Warehouse

    Holland Bartels, L. E.; Dewey, M.R.

    1997-01-01

    The effects of season, presence of vegetation, and time of day on seine capture efficiency for fish were evaluated using test enclosures in the upper Mississippi River. Overall capture efficiency of the seine haul was 49% (53% during the day and 43% at night). During daytime tests, the efficiency ranged from 39% to 74% but did not differ statistically between sites or among dates. At night, the efficiency was higher at the vegetated than at the nonvegetated site (55% vs 32%) and declined through time from 56% in May to 28% in October. Although susceptibility to capture differed among taxa, we could not predict either total catch efficiency or efficiency within a given taxon for a given sample. Adjustment of catch data with various estimates of efficiency reduced the mean absolute error for all sampling dates from 51% to 24%, but the error of the adjusted data still ranged from -58% to +54% on any given sampling date. These results indicate that it is difficult to make accurate adjustment of catch data to compensate for gear bias in studies of seasonal habitat use.

  6. Prevalence of Toxoplasma gondii in Chicken samples from delta of Egypt using ELISA, histopathology and immunohistochemistry.

    PubMed

    Ibrahim, Hany M; Abdel-Ghaffar, Fathy; Osman, Gamalat Y; El-Shourbagy, Safinaz H; Nishikawa, Yoshifumi; Khattab, Reham A

    2016-06-01

    Estimates of the prevalence of zoonotic diseases are helpful for monitoring and improving public health. Laboratory-based surveillance provides crucial information for assessing zoonotic disease trends and developments. Toxoplasmosis is considered a zoonotic disease and has both medical and veterinary importance, since it leads to abortion in humans and several animal species. In view of the worldwide importance of T. gondii, this study aimed to estimate the prevalence of T. gondii in chickens from the Delta of Egypt. A total of 304 blood and brain samples were collected from Egyptian chickens from Gharbiya, Qalyoubiya, Minufiya, Beheira, Kafr EL-Shaykh and Dakahlia Provinces. To determine the serological and histopathological prevalence of T. gondii, the samples were examined by ELISA, histopathology and immunohistochemistry (IHC). The prevalence of T. gondii was 11.18%, 6.91% and 6.91% by ELISA, histopathology and IHC, respectively. Statistically significant differences in the prevalence of T. gondii were observed on the basis of season, sex and habitat. These data provide valuable information regarding the epidemiology of T. gondii infections in Egyptian chickens, which can be employed in developing efficient strategies for disease management and control.

  7. Memory-efficient dynamic programming backtrace and pairwise local sequence alignment.

    PubMed

    Newberg, Lee A

    2008-08-15

    A backtrace through a dynamic programming algorithm's intermediate results, in search of an optimal path, to sample paths according to an implied probability distribution, or as the second stage of a forward-backward algorithm, is a task of fundamental importance in computational biology. When there is insufficient space to store all intermediate results in high-speed memory (e.g. cache), existing approaches store selected stages of the computation and recompute missing values from these checkpoints on an as-needed basis. Here we present an optimal checkpointing strategy, and demonstrate its utility with pairwise local sequence alignment of sequences of length 10,000. Sample C++ code for optimal backtrace is available in the Supplementary Materials. Supplementary data are available at Bioinformatics online.
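
    The flavor of checkpointed backtrace can be conveyed with a small sketch: store every s-th row of the dynamic programming matrix during the forward pass, then rebuild one block of rows at a time while walking back. The code below uses uniform checkpoints on a toy global alignment, not the paper's optimal checkpointing strategy.

      import numpy as np

      MATCH, MISMATCH, GAP = 1, -1, -1   # toy scoring scheme

      def next_row(prev, a_char, b):
          """Compute one row of the global-alignment DP matrix from the previous row."""
          row = np.empty_like(prev)
          row[0] = prev[0] + GAP
          for j in range(1, len(b) + 1):
              diag = prev[j - 1] + (MATCH if a_char == b[j - 1] else MISMATCH)
              row[j] = max(diag, prev[j] + GAP, row[j - 1] + GAP)
          return row

      def align(a, b, stride=4):
          # Forward pass: keep only every `stride`-th row as a checkpoint.
          row = np.arange(len(b) + 1) * GAP
          ckpts = {0: row.copy()}
          for i in range(1, len(a) + 1):
              row = next_row(row, a[i - 1], b)
              if i % stride == 0:
                  ckpts[i] = row.copy()
          # Backtrace: recompute one block of rows at a time from its checkpoint.
          i, j, ops = len(a), len(b), []
          while i > 0 or j > 0:
              if i == 0:
                  ops.append('I'); j -= 1; continue
              base = ((i - 1) // stride) * stride
              rows = [ckpts[base]]
              for r in range(base + 1, i + 1):
                  rows.append(next_row(rows[-1], a[r - 1], b))
              while i > base:
                  cur, prev = rows[i - base], rows[i - base - 1]
                  if j > 0 and cur[j] == prev[j - 1] + (MATCH if a[i - 1] == b[j - 1] else MISMATCH):
                      ops.append('M'); i -= 1; j -= 1
                  elif cur[j] == prev[j] + GAP:
                      ops.append('D'); i -= 1
                  else:
                      ops.append('I'); j -= 1
          return row[-1], ''.join(reversed(ops))

      print(align("GATTACA", "GCATGCA"))   # (score, edit-operation string)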

  8. Development of mass spectrometric techniques applicable to the search for organic matter in the lunar crust

    NASA Technical Reports Server (NTRS)

    Biemann, K.

    1973-01-01

    Data processing techniques were developed to measure with high precision and sensitivity the line spectra produced by a high resolution mass spectrometer. The most important aspect of this phase was the interfacing of a modified precision microphotometer-comparator with a computer and the improvement of existing software to serve the special needs of the investigation of lunar samples. In addition, a gas-chromatograph mass spectrometer system was interfaced with the same computer to allow continuous recording of mass spectra on a gas chromatographic effluent and efficient evaluation of the resulting data. These techniques were then used to detect and identify organic compounds present in the samples returned by the Apollo 11 and 12 missions.

  9. Transmission efficiency of the sigma virus in natural populations of its host, Drosophila melanogaster.

    PubMed

    Fleuriet, A

    1982-01-01

    A study of viral samples collected in French natural populations of Drosophila melanogaster since 1969 indicates that natural populations include, as expected, both stabilized and non-stabilized infected individuals. In agreement with previous observations made on other characters of the virus, the viral samples collected appear to be homogeneous in the efficiency of hereditary transmission. However, this efficiency is greater than the average value observed with virus perpetuated in infected laboratory fly strains. One sample collected in Gabon and three collected in the U.S.A. appear to differ from the French samples in at least one of the traits studied in these experiments.

  10. Research on the self-absorption corrections for PGNAA of large samples

    NASA Astrophysics Data System (ADS)

    Yang, Jian-Bo; Liu, Zhi; Chang, Kang; Li, Rui

    2017-02-01

    When a large sample is analysed with prompt gamma neutron activation analysis (PGNAA), neutron self-shielding and gamma self-absorption affect the accuracy; a correction method for the detection efficiency of each element relative to H in a large sample is therefore described. The influences of the thickness and density of cement samples on the H detection efficiency, as well as of the impurities Fe2O3 and SiO2 on the prompt γ-ray yield of each element in the cement samples, were studied. Phase functions for Ca, Fe, and Si relative to H under changes in sample thickness and density were provided, to avoid the complicated procedure of preparing a corresponding density or thickness scale for measuring samples at each density or thickness value, and to present a simplified method for the measurement efficiency scale for prompt-gamma neutron activation analysis.

  11. Molecular analysis of bacterial communities in raw cow milk and the impact of refrigeration on its structure and dynamics.

    PubMed

    Raats, Dina; Offek, Maya; Minz, Dror; Halpern, Malka

    2011-05-01

    The impact of refrigeration on raw cow milk bacterial communities in three farm bulk tanks and three dairy plant silo tanks was studied using two methods: DGGE and cloning. Both methods demonstrated that bacterial taxonomic diversity decreased during refrigeration. Gammaproteobacteria, especially Pseudomonadales, dominated the milk after refrigeration. Farm samples and dairy plant samples differed in their microbial community composition, the former showing prevalence of Gram-positive bacteria affiliated with the classes Bacilli, Clostridia and Actinobacteria, the latter showing prevalence of Gram-negative species belonging to the Gammaproteobacteria class. Actinobacteria prevalence in the farm milk samples immediately after collection stood at about 25% of the clones. A previous study had found that psychrotolerant Actinobacteria identified in raw cow milk demonstrated both lipolytic and proteolytic enzymatic activity. Thus, we conclude that although Pseudomonadales play an important role in milk spoilage after long periods of cold incubation, Actinobacteria occurrence may play an important role when assessing the quality of milk arriving at the dairy plant from different farms. As new cooling technologies reduce the initial bacterial counts of milk to very low levels, more sensitive and efficient methods to evaluate the bacterial quality of raw milk are required. The present findings are an important step towards achieving this goal.

  12. Determination of alkaloids in onion nectar by micellar electrokinetic chromatography.

    PubMed

    Carolina Soto, Verónica; Jofré, Viviana Patricia; Galmarini, Claudio Romulo; Silva, María Fernanda

    2016-07-01

    Nectar is the most important floral reward offered by plants to insects. Minor components such as alkaloids in nectar affect bee foraging, with great influence on seed production. CE is an advantageous tool for the analysis of unexplored samples such as onion nectar due to the limited amounts of sample available. Considering the importance of these compounds, a simultaneous determination of nicotine, theophylline, theobromine, caffeine, harmaline and piperine in onion nectar by MEKC-UV is herein reported. The extraction of alkaloids from nectar was performed by SPE using a homemade miniaturized column (C18). The effects of several important factors on extraction efficiency as well as electrophoretic performance were investigated to establish optimum conditions. Under the proposed conditions, the analytes can be separated within 15 min in a 50 cm effective length capillary (75 μm id) at a separation voltage of 20 kV in 20 mmol/L sodium tetraborate, 100 mmol/L SDS. The sample requirement was reduced by up to 2000 times compared to traditional methods, reaching limits of detection as low as 0.0153 ng/L. For the first time, this study demonstrates that there are marked qualitative and quantitative differences in nectar alkaloids between open-pollinated and male-sterile lines (MSLs), and also within MSLs.

  13. Annealed Importance Sampling Reversible Jump MCMC algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karagiannis, Georgios; Andrieu, Christophe

    2013-03-20

    It will soon be 20 years since reversible jump Markov chain Monte Carlo (RJ-MCMC) algorithms were proposed. They have significantly extended the scope of Markov chain Monte Carlo simulation methods, offering the promise of routinely tackling transdimensional sampling problems, as encountered in Bayesian model selection problems for example, in a principled and flexible fashion. Their practical efficient implementation, however, still remains a challenge. A particular difficulty encountered in practice is the choice of the dimension matching variables (both their nature and their distribution) and the reversible transformations which allow one to define the one-to-one mappings underpinning the design of these algorithms. Indeed, even seemingly sensible choices can lead to algorithms with very poor performance. The focus of this paper is the development and performance evaluation of a method, annealed importance sampling RJ-MCMC (aisRJ), which addresses this problem by mitigating the sensitivity of RJ-MCMC algorithms to the aforementioned poor design. As we shall see, the algorithm can be understood as an “exact approximation” of an idealized MCMC algorithm that would sample from the model probabilities directly in a model selection set-up. Such an idealized algorithm may have good theoretical convergence properties, but typically cannot be implemented, and our algorithms can approximate the performance of such idealized algorithms to an arbitrary degree while not introducing any bias for any degree of approximation. Our approach combines the dimension matching ideas of RJ-MCMC with annealed importance sampling and its Markov chain Monte Carlo implementation. We illustrate the performance of the algorithm with numerical simulations which indicate that, although the approach may at first appear computationally involved, it is in fact competitive.
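
    The annealed importance sampling ingredient can be illustrated compactly: anneal from a tractable distribution to the target through geometric bridges, accumulating importance weights and refreshing particles with Metropolis moves. The sketch below estimates a normalizing constant for an invented bimodal target and omits the reversible jump machinery entirely.

      import numpy as np

      rng = np.random.default_rng(0)

      # Anneal from f_0 (standard normal, Z0 = sqrt(2*pi)) to an unnormalized
      # bimodal target f_1 through geometric bridges f_b = f_0^(1-b) * f_1^b.
      def log_f0(x):
          return -0.5 * x**2

      def log_f1(x):
          return np.logaddexp(-0.5 * (x - 3)**2, -0.5 * (x + 3)**2)

      def ais(n_particles=2000, n_temps=200, step=0.5):
          betas = np.linspace(0.0, 1.0, n_temps)
          x = rng.normal(size=n_particles)          # exact draws from f_0
          logw = np.zeros(n_particles)
          for b0, b1 in zip(betas[:-1], betas[1:]):
              # weight increment for raising the inverse temperature
              logw += (b1 - b0) * (log_f1(x) - log_f0(x))
              # one Metropolis step leaving f_{b1} invariant
              logp = lambda z: (1 - b1) * log_f0(z) + b1 * log_f1(z)
              prop = x + step * rng.normal(size=n_particles)
              accept = np.log(rng.uniform(size=n_particles)) < logp(prop) - logp(x)
              x = np.where(accept, prop, x)
          # E[w] = Z1/Z0, so Z1 = Z0 * mean(w) with Z0 = sqrt(2*pi)
          logZ1 = np.logaddexp.reduce(logw) - np.log(n_particles) + 0.5 * np.log(2 * np.pi)
          return np.exp(logZ1)

      print(ais())   # true value: 2 * sqrt(2*pi) ~ 5.01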

  14. Scanning Electron Microscopy with Samples in an Electric Field

    PubMed Central

    Frank, Ludĕk; Hovorka, Miloš; Mikmeková, Šárka; Mikmeková, Eliška; Müllerová, Ilona; Pokorná, Zuzana

    2012-01-01

    The high negative bias of a sample in a scanning electron microscope constitutes the “cathode lens” with a strong electric field just above the sample surface. This mode offers a convenient tool for controlling the landing energy of electrons down to units or even fractions of electronvolts with only slight readjustments of the column. Moreover, the field accelerates and collimates the signal electrons to earthed detectors above and below the sample, thereby assuring high collection efficiency and high amplification of the image signal. One important feature is the ability to acquire the complete emission of the backscattered electrons, including those emitted at high angles with respect to the surface normal. The cathode lens aberrations are proportional to the landing energy of electrons so the spot size becomes nearly constant throughout the full energy scale. At low energies and with their complete angular distribution acquired, the backscattered electron images offer enhanced information about crystalline and electronic structures thanks to contrast mechanisms that are otherwise unavailable. Examples from various areas of materials science are presented.

  15. Importance sampling large deviations in nonequilibrium steady states. I.

    PubMed

    Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T

    2018-03-28

    Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.

  16. Importance sampling large deviations in nonequilibrium steady states. I

    NASA Astrophysics Data System (ADS)

    Ray, Ushnish; Chan, Garnet Kin-Lic; Limmer, David T.

    2018-03-01

    Large deviation functions contain information on the stability and response of systems driven into nonequilibrium steady states and in such a way are similar to free energies for systems at equilibrium. As with equilibrium free energies, evaluating large deviation functions numerically for all but the simplest systems is difficult because by construction they depend on exponentially rare events. In this first paper of a series, we evaluate different trajectory-based sampling methods capable of computing large deviation functions of time integrated observables within nonequilibrium steady states. We illustrate some convergence criteria and best practices using a number of different models, including a biased Brownian walker, a driven lattice gas, and a model of self-assembly. We show how two popular methods for sampling trajectory ensembles, transition path sampling and diffusion Monte Carlo, suffer from exponentially diverging correlations in trajectory space as a function of the bias parameter when estimating large deviation functions. Improving the efficiencies of these algorithms requires introducing guiding functions for the trajectories.
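
    The role of guiding functions can be seen already in an i.i.d. caricature: a naive estimate of a cumulant generating function is dominated by trajectories that are essentially never sampled, while sampling from an exponentially tilted (guided) distribution and reweighting recovers the exact answer. The sketch below is this caricature only, not the trajectory-space methods studied in the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      # For i.i.d. standard-normal increments x_1..x_N, the scaled cumulant
      # generating function of A = sum(x) is exactly psi(k) = k**2 / 2.
      N, M, k = 100, 20_000, 1.0

      # Naive estimator: dominated by ~10-sigma trajectories that are
      # essentially never sampled, so it is biased low and very noisy.
      x = rng.normal(size=(M, N))
      psi_naive = np.log(np.mean(np.exp(k * x.sum(axis=1)))) / N

      # Guided estimator: sample from the exponentially tilted density N(k, 1)
      # and reweight; here log w = k*A + sum(log p0 - log pk) = N*k**2/2 exactly,
      # so this estimator has zero variance.
      y = rng.normal(loc=k, size=(M, N))
      logw = k * y.sum(axis=1) - (k * y - k**2 / 2).sum(axis=1)
      psi_tilted = np.log(np.mean(np.exp(logw))) / N

      print(psi_naive, psi_tilted, k**2 / 2)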

  17. Establishing an academic biobank in a resource-challenged environment.

    PubMed

    Soo, Cassandra Claire; Mukomana, Freedom; Hazelhurst, Scott; Ramsay, Michele

    2017-05-24

    Past practices of informal sample collections and spreadsheets for data and sample management fall short of best-practice models for biobanking, and are neither cost-effective nor efficient enough to adequately serve the needs of large research studies. The biobank of the Sydney Brenner Institute for Molecular Bioscience serves as a bioresource for institutional, national and international research collaborations. It provides high-quality human biospecimens from African populations, secure data and sample curation and storage, as well as monitored sample handling and management processes, to promote both non-communicable and infectious-disease research. Best-practice guidelines have been adapted to align with a low-resource setting and have been instrumental in the development of a quality-management system, including standard operating procedures and a quality-control regimen. Here, we provide a summary of 10 important considerations for initiating and establishing an academic research biobank in a low-resource setting. These include addressing ethical, legal, technical, accreditation and/or certification concerns and financial sustainability.

  18. Establishing an academic biobank in a resource-challenged environment

    PubMed Central

    Soo, C C; Mukomana, F; Hazelhurst, S; Ramsay, M

    2018-01-01

    Past practices of informal sample collections and spreadsheets for data and sample management fall short of best-practice models for biobanking, and are neither cost-effective nor efficient enough to adequately serve the needs of large research studies. The biobank of the Sydney Brenner Institute for Molecular Bioscience serves as a bioresource for institutional, national and international research collaborations. It provides high-quality human biospecimens from African populations, secure data and sample curation and storage, as well as monitored sample handling and management processes, to promote both non-communicable and infectious-disease research. Best-practice guidelines have been adapted to align with a low-resource setting and have been instrumental in the development of a quality-management system, including standard operating procedures and a quality-control regimen. Here, we provide a summary of 10 important considerations for initiating and establishing an academic research biobank in a low-resource setting. These include addressing ethical, legal, technical, accreditation and/or certification concerns and financial sustainability. PMID:28604319

  19. Flutriafol and pyraclostrobin residues in Brazilian green coffees.

    PubMed

    de Oliveira, Luiz Alberto Bandeira; Pacheco, Henrique Poltronieri; Scherer, Rodrigo

    2016-01-01

    The aim of this work was to monitor flutriafol and pyraclostrobin residues in Brazilian green coffees. More than 10,000 samples were analyzed. The pesticides were extracted using the QuEChERS method and analyzed by LC-MS/MS. The validated method is fast, with 5 min runs, and efficient, as precision and accuracy showed RSDs no greater than 5% and recoveries within the 88-119% range. The LOQ for both flutriafol and pyraclostrobin was 0.005 mg/kg. The results showed that the percentage of nonconformities regarding flutriafol increased over the years, affecting over 1,200 samples (11.8%). On the other hand, just 15 samples (0.15%) presented residues above 10 μg/kg for pyraclostrobin. Considering that flutriafol is a toxic and carcinogenic pesticide, and that the number of irregularities has increased over the years, it is important to implement public actions to assure consumer safety.

  20. Conventional and Accelerated-Solvent Extractions of Green Tea (Camellia sinensis) for Metabolomics-based Chemometrics

    PubMed Central

    Kellogg, Joshua J.; Wallace, Emily D.; Graf, Tyler N.; Oberlies, Nicholas H.; Cech, Nadja B.

    2018-01-01

    Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent extraction as an efficient methodology for metabolomics studies. PMID:28787673

  1. Efficient free energy calculations by combining two complementary tempering sampling methods.

    PubMed

    Xie, Liangxu; Shen, Lin; Chen, Zhe-Ning; Yang, Mingjun

    2017-01-14

    Although energy barriers can be efficiently crossed in reaction coordinate (RC) guided sampling, this type of method suffers from difficulty in identifying the correct RCs or from the high dimensionality of the RCs required for a given system. If only approximate RCs with significant barriers are used in the simulations, hidden energy barriers of small to medium height may exist in other degrees of freedom (DOFs) relevant to the target process and consequently cause insufficient sampling. To address sampling in this so-called hidden-barrier situation, here we propose an effective approach that combines temperature accelerated molecular dynamics (TAMD), an efficient RC-guided sampling method, with integrated tempering sampling (ITS), a generalized ensemble sampling method. In this combined ITS-TAMD method, the sampling along the major RCs with high energy barriers is guided by TAMD, and the sampling of the remaining DOFs with lower but non-negligible barriers is enhanced by ITS. The performance of ITS-TAMD on three systems in processes with hidden barriers has been examined. In comparison to the standalone TAMD or ITS approach, the present hybrid method shows three main improvements. (1) Sampling efficiency can be improved at least five-fold even in the presence of hidden energy barriers. (2) The canonical distribution can be more accurately recovered, from which the thermodynamic properties along other collective variables can be computed correctly. (3) The robustness of the selection of major RCs suggests that the dimensionality of the necessary RCs can be reduced. Our work suggests further potential applications of the ITS-TAMD method as an efficient and powerful tool for the investigation of a broad range of interesting cases.

  2. Evolution of efficient methods to sample lead sources, such as house dust and hand dust, in the homes of children

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Que Hee, S.S.; Peace, B.; Clark, C.S.

    Efficient sampling methods to recover lead-containing house dust and hand dust have been evolved so that sufficient lead is collected for analysis and to ensure that correlational analyses linking these two parameters to blood lead are not dependent on the efficiency of sampling. Precise collection of loose house dust from a 1-unit area (484 cm²) with a Tygon or stainless steel sampling tube connected to a portable sampling pump (1.2 to 2.5 liters/min) required repetitive sampling (three times). The Tygon tube sampling technique for loose house dust <177 μm in diameter was around 72% efficient with respect to dust weight and lead collection. A representative house dust contained 81% of its total weight in this fraction. A single handwipe for applied loose hand dust was not acceptably efficient or precise, and at least three wipes were necessary to achieve recoveries of >80% of the lead applied. House dusts of different particle sizes <246 μm adhered equally well to hands. Analysis of lead-containing material usually required at least three digestions/decantations using hot plate or microwave techniques to allow at least 90% of the lead to be recovered. It was recommended that other investigators validate their handwiping, house dust sampling, and digestion techniques to facilitate comparison of results across studies. The final methodology for the Cincinnati longitudinal study was three sampling passes for surface dust using a stainless steel sampling tube; three microwave digestions/decantations for analysis of dust and paint; and three wipes with handwipes, with one digestion/decantation for the analysis of six handwipes together.

  3. Efficient free energy calculations by combining two complementary tempering sampling methods

    NASA Astrophysics Data System (ADS)

    Xie, Liangxu; Shen, Lin; Chen, Zhe-Ning; Yang, Mingjun

    2017-01-01

    Although energy barriers can be efficiently crossed in reaction coordinate (RC) guided sampling, this type of method suffers from difficulty in identifying the correct RCs or from the high dimensionality of the RCs required for a given system. If only approximate RCs with significant barriers are used in the simulations, hidden energy barriers of small to medium height may exist in other degrees of freedom (DOFs) relevant to the target process and consequently cause insufficient sampling. To address sampling in this so-called hidden-barrier situation, here we propose an effective approach that combines temperature accelerated molecular dynamics (TAMD), an efficient RC-guided sampling method, with integrated tempering sampling (ITS), a generalized ensemble sampling method. In this combined ITS-TAMD method, the sampling along the major RCs with high energy barriers is guided by TAMD, and the sampling of the remaining DOFs with lower but non-negligible barriers is enhanced by ITS. The performance of ITS-TAMD on three systems in processes with hidden barriers has been examined. In comparison to the standalone TAMD or ITS approach, the present hybrid method shows three main improvements. (1) Sampling efficiency can be improved at least five-fold even in the presence of hidden energy barriers. (2) The canonical distribution can be more accurately recovered, from which the thermodynamic properties along other collective variables can be computed correctly. (3) The robustness of the selection of major RCs suggests that the dimensionality of the necessary RCs can be reduced. Our work suggests further potential applications of the ITS-TAMD method as an efficient and powerful tool for the investigation of a broad range of interesting cases.

  4. Evaluation of Students' Perceptions about Efficiency of Educational Club Practices in Primary Schools

    ERIC Educational Resources Information Center

    Gelen, Ismail; Onay, Ihsan; Varol, Volkan

    2014-01-01

    The purpose of this study is to evaluate the efficiency of the "Educational Club Practices" that have been part of the Elementary School program since 2005-2006, by examining students' attitudes toward them. The sample was selected in two steps. First, stratified sampling was employed and then random sampling was…
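
    A generic sketch of such a two-step design is given below: units are first stratified and a simple random sample is then drawn within each stratum with proportional allocation. The roster and strata are invented; the study's actual frame is not described in enough detail to reproduce, so this is only an illustration of the general approach.

      import random
      from collections import defaultdict

      random.seed(7)

      # Hypothetical roster: (student_id, school_type) pairs; strata = school types.
      students = [(i, random.choice(["urban", "suburban", "rural"])) for i in range(3000)]

      def two_step_sample(units, n_total):
          """Step 1: stratify; step 2: simple random sample within each stratum,
          with allocation proportional to stratum size (rounding may shift the
          total by a unit or two)."""
          strata = defaultdict(list)
          for unit, label in units:
              strata[label].append(unit)
          sample = []
          for label, members in strata.items():
              n_h = round(n_total * len(members) / len(units))
              sample.extend(random.sample(members, n_h))
          return sample

      print(len(two_step_sample(students, 300)))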

  5. A Novel Energy-Efficient Approach for Human Activity Recognition

    PubMed Central

    Zheng, Lingxiang; Wu, Dihong; Ruan, Xiaoyang; Weng, Shaolin; Tang, Biyu; Lu, Hai; Shi, Haibin

    2017-01-01

    In this paper, we propose a novel energy-efficient approach for a mobile activity recognition system (ARS) to detect human activities. The proposed energy-efficient ARS, using low sampling rates, can achieve high recognition accuracy and low energy consumption. A novel classifier that integrates a hierarchical support vector machine and context-based classification (HSVMCC) is presented to achieve high activity-recognition accuracy when the sampling rate is lower than the activity frequency, i.e., when the Nyquist sampling theorem is not satisfied. We tested the proposed energy-efficient approach with data collected from 20 volunteers (14 males and 6 females), and an average recognition accuracy of around 96.0% was achieved. Results show that using a low sampling rate of 1 Hz can save 17.3% and 59.6% of energy compared with sampling rates of 5 Hz and 50 Hz. The proposed low-sampling-rate approach can greatly reduce power consumption while maintaining high activity-recognition accuracy. The composition of power consumption in an online ARS is also investigated in this paper. PMID:28885560

  6. Improving laboratory results turnaround time by reducing pre analytical phase.

    PubMed

    Khalifa, Mohamed; Khalid, Parwaiz

    2014-01-01

    Laboratory turnaround time is considered one of the most important indicators of work efficiency in hospitals. Physicians always need timely results to take effective clinical decisions, especially in the emergency department, where results guide decisions on whether to admit patients to the hospital, discharge them home or order further investigations. A retrospective data analysis study was performed to assess how training ER and lab staff in new routines for sample collection and transportation affected the pre-analytical phase of turnaround time. Renal profile tests requested by the ER and performed in 2013 were selected as a sample, and data on 7,519 tests were retrieved and analyzed to compare turnaround time intervals before and after implementing the new routines. Results showed significant time reductions in the "Request to Sample Collection" and "Collection to In Lab Delivery" intervals, with less significant improvement in the analytical phase of the turnaround time.

  7. Sparse Image Reconstruction on the Sphere: Analysis and Synthesis.

    PubMed

    Wallis, Christopher G R; Wiaux, Yves; McEwen, Jason D

    2017-11-01

    We develop techniques to solve ill-posed inverse problems on the sphere by sparse regularization, exploiting sparsity in both axisymmetric and directional scale-discretized wavelet space. Denoising, inpainting, and deconvolution problems, and combinations thereof, are considered as examples. Inverse problems are solved in both the analysis and synthesis settings, with a number of different sampling schemes. The most effective approach is that with the most restricted solution-space, which depends on the interplay between the adopted sampling scheme, the selection of the analysis/synthesis problem, and any weighting of the ℓ1 norm appearing in the regularization problem. More efficient sampling schemes on the sphere improve reconstruction fidelity by restricting the solution-space and also by improving sparsity in wavelet space. We apply the technique to denoise Planck 353-GHz observations, improving the ability to extract the structure of Galactic dust emission, which is important for studying Galactic magnetism.
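
    The synthesis-side sparse regularization problem min_x 0.5*||y - Ax||² + λ||x||₁ that underlies this kind of reconstruction can be solved with iterative soft thresholding. The sketch below works in a flat (non-spherical) setting with a random sensing matrix, purely to illustrate the optimization; it is not the authors' spherical wavelet machinery.

      import numpy as np

      def soft(u, t):
          return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

      # ISTA for min_x 0.5*||y - A x||^2 + lam*||x||_1 (synthesis formulation)
      def ista(A, y, lam, n_iter=200):
          L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
          x = np.zeros(A.shape[1])
          for _ in range(n_iter):
              x = soft(x + A.T @ (y - A @ x) / L, lam / L)
          return x

      rng = np.random.default_rng(0)
      A = rng.standard_normal((60, 200)) / np.sqrt(60)
      x0 = np.zeros(200)
      x0[rng.choice(200, 8, replace=False)] = rng.standard_normal(8)   # sparse truth
      y = A @ x0 + 0.01 * rng.standard_normal(60)
      x_hat = ista(A, y, lam=0.02)
      print(np.linalg.norm(x_hat - x0))          # small: sparse signal approximately recovered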

  8. Compressed sensing of hyperspectral images based on scrambled block Hadamard ensemble

    NASA Astrophysics Data System (ADS)

    Wang, Li; Feng, Yan

    2016-11-01

    A fast measurement matrix based on scrambled block Hadamard ensemble for compressed sensing (CS) of hyperspectral images (HSI) is investigated. The proposed measurement matrix offers several attractive features. First, the proposed measurement matrix possesses Gaussian behavior, which illustrates that the matrix is universal and requires a near-optimal number of samples for exact reconstruction. In addition, it could be easily implemented in the optical domain due to its integer-valued elements. More importantly, the measurement matrix only needs small memory for storage in the sampling process. Experimental results on HSIs reveal that the reconstruction performance of the proposed measurement matrix is comparable or better than Gaussian matrix and Bernoulli matrix using different reconstruction algorithms while consuming less computational time. The proposed matrix could be used in CS of HSI, which would save the storage memory on board, improve the sampling efficiency, and ameliorate the reconstruction quality.
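
    A minimal version of the measurement operator is easy to write down: scramble the signal with a random permutation, apply a small Hadamard transform block by block, and keep a random subset of coefficients. The sketch below illustrates this on a random vector standing in for one hyperspectral band; the block size and dimensions are arbitrary choices, not those of the paper.

      import numpy as np
      from scipy.linalg import hadamard

      rng = np.random.default_rng(0)

      def sbhe_measure(x, m, block=32):
          """Scrambled block Hadamard ensemble: permute the signal, apply a
          B x B Hadamard transform block-by-block, keep m random coefficients.
          Integer-valued and low-memory: only the permutation and one small
          Hadamard block need storing."""
          n = x.size
          assert n % block == 0                  # block must be a power of 2 dividing n
          perm = rng.permutation(n)              # global scrambling
          H = hadamard(block)                    # +/-1 matrix
          coeffs = (H @ x[perm].reshape(-1, block).T).T.ravel()
          keep = rng.choice(n, size=m, replace=False)
          return coeffs[keep] / np.sqrt(block)   # normalized measurements

      x = rng.standard_normal(1024)              # stand-in for one HSI band
      y = sbhe_measure(x, m=256)
      print(y.shape)                             # (256,)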

  9. An Efficient Monte Carlo Method for Modeling Radiative Transfer in Protoplanetary Disks

    NASA Technical Reports Server (NTRS)

    Kim, Stacy

    2011-01-01

    Monte Carlo methods have been shown to be effective and versatile in modeling radiative transfer processes to calculate model temperature profiles for protoplanetary disks. Temperature profiles are important for connecting physical structure to observations and for understanding the conditions for planet formation and migration. However, certain areas of the disk are under-sampled, such as the optically thick disk interior, or are of particular interest, such as the snow line (where water vapor condenses into ice) and the area surrounding a protoplanet. To improve the sampling, photon packets can be preferentially scattered and re-emitted toward the preferred locations at the cost of weighting packet energies to conserve the average energy flux. Here I report on the weighting schemes developed, how they can be applied to various models, and how they affect simulation mechanics and results. We find that improvements in sampling do not always imply similar improvements in temperature accuracy and calculation speed.
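
    The weighting idea can be reduced to a one-dimensional sketch: emit packets from a biased direction distribution q instead of the physical p, and carry a weight w = p/q so average energy is conserved. The bias law below is invented for illustration and is not the weighting scheme of the reported simulations.

      import numpy as np

      rng = np.random.default_rng(0)

      def emit_packets(n, bias=3.0):
          # Physical emission: isotropic in mu = cos(theta), p(mu) = 1/2 on [-1, 1].
          # Biased (importance) emission toward mu ~ +1: q(mu) ~ exp(bias * mu),
          # sampled by inverting its CDF; the weight w = p/q keeps estimates unbiased.
          u = rng.uniform(size=n)
          norm = np.exp(bias) - np.exp(-bias)
          mu = np.log(np.exp(-bias) + u * norm) / bias
          q = bias * np.exp(bias * mu) / norm
          return mu, 0.5 / q

      mu, w = emit_packets(100_000)
      print(w.mean())                  # ~1.0: total emitted energy is conserved
      print(np.mean(w * (mu > 0.9)))   # ~0.05: matches the isotropic fraction at mu > 0.9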

  10. Silk fiber for in-tube solid-phase microextraction to detect aldehydes by chemical derivatization.

    PubMed

    Wang, Xiuqin; Pan, Lei; Feng, Juanjuan; Tian, Yu; Luo, Chuannan; Sun, Min

    2017-11-03

    Aldehydes are potentially damaging pollutants in the environment, but they are difficult to determine because of their low concentration levels. Accurate analysis of aldehydes therefore requires efficient sample preparation with selective enrichment and rapid separation. Environmentally friendly silk fiber was directly applied as an adsorbent material to develop an in-tube solid-phase microextraction method for analyzing aqueous samples, combined with high performance liquid chromatography. 2,4-Dinitrophenylhydrazine was used as a derivatization reagent for chemical derivatization of aldehydes before extraction. Under optimum conditions, an online analysis method was established with limits of detection in the range of 0.005-0.01 μg L(-1) and linearity in the range of 0.03-10 μg L(-1). Three aldehydes were determined in two real samples, and the relative recoveries were in the range of 95-102%. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Stochastic subset selection for learning with kernel machines.

    PubMed

    Rhinelander, Jason; Liu, Xiaoping P

    2012-06-01

    Kernel machines have gained much popularity in applications of machine learning. Support vector machines (SVMs) are a subset of kernel machines and generalize well for classification, regression, and anomaly detection tasks. The training procedure for traditional SVMs involves solving a quadratic programming (QP) problem. The QP problem scales superlinearly in computational effort with the number of training samples and is often used for the offline batch processing of data. Kernel machines operate by retaining a subset of observed data during training. The data vectors contained within this subset are referred to as support vectors (SVs). The work presented in this paper introduces a subset selection method for the use of kernel machines in online, changing environments. Our algorithm uses a stochastic indexing technique to select a subset of SVs when computing the kernel expansion. The work described here is novel because it separates the selection of kernel basis functions from the training algorithm used. The subset selection algorithm presented here can be used in conjunction with any online training technique. It is important for online kernel machines to be computationally efficient due to the real-time requirements of online environments. Our algorithm is an important contribution because it scales linearly with the number of training samples and is compatible with current training techniques. It outperforms standard techniques in terms of computational efficiency and provides increased recognition accuracy in our experiments. We provide results from experiments using both simulated and real-world data sets to verify our algorithm.
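
    One plausible reading of the idea in code (a sketch, not the authors' exact algorithm): draw support-vector indices with probability proportional to |alpha_i| and importance-weight the summands, which keeps the kernel expansion unbiased at O(m) instead of O(n) cost:

        import numpy as np

        def rbf(a, b, gamma=1.0):
            return np.exp(-gamma * np.sum((a - b) ** 2))

        def stochastic_expansion(x, svs, alphas, m, rng):
            # Unbiased estimate of f(x) = sum_i alpha_i * k(sv_i, x)
            # from a stochastically indexed subset of m support vectors.
            p = np.abs(alphas) / np.sum(np.abs(alphas))
            idx = rng.choice(len(alphas), size=m, p=p)
            return float(np.sum([alphas[i] * rbf(svs[i], x) / (m * p[i])
                                 for i in idx]))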

  12. Toward cost-efficient sampling methods

    NASA Astrophysics Data System (ADS)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small number of vertices with high node degree can carry most of the structural information of a complex network. The two proposed sampling methods are efficient in sampling high-degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method builds on the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. In order to demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods on three commonly used simulated networks (scale-free, random, and small-world networks) and on two real networks. The experimental results illustrate that the two proposed sampling methods perform much better than the existing sampling methods in recovering true network structure characteristics, as reflected by the clustering coefficient, Bonacich centrality and average path length, especially when the sampling rate is low.
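
    A minimal sketch of the degree-biased idea behind the improved snowball method (illustrative, not the paper's exact procedure): expand through the highest-degree neighbours first, so that hubs enter the sample early even at low sampling rates:

        def degree_biased_snowball(adj, seed_node, budget):
            # adj: dict mapping node -> set of neighbours.
            sampled, frontier = {seed_node}, [seed_node]
            while frontier and len(sampled) < budget:
                node = frontier.pop(0)
                # visit unseen neighbours in order of decreasing degree
                for v in sorted(adj[node] - sampled,
                                key=lambda u: -len(adj[u])):
                    if len(sampled) >= budget:
                        break
                    sampled.add(v)
                    frontier.append(v)
            return sampled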

  13. Use of immunomagnetic separation for the detection of Desulfovibrio vulgaris from environmental samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, R.; Hazen, T.C.; Joyner, D.C.

    2011-04-15

    Immunomagnetic separation (IMS) has proved highly efficient for recovering microorganisms from heterogeneous samples. The current investigation targeted the separation of viable cells of the sulfate-reducing bacterium Desulfovibrio vulgaris. Streptavidin-coupled paramagnetic beads and biotin-labeled antibodies raised against surface antigens of this microorganism were used to capture D. vulgaris cells both in bioreactor-grown laboratory samples and in extremely low-biomass environmental soil and subsurface drilling samples. Initial studies on detection, recovery efficiency and viability for IMS were performed with laboratory-grown D. vulgaris cells using various cell densities. Efficiency of cell isolation and recovery (i.e., release of the microbial cells from the beads following separation) was followed by microscopic imaging and acridine orange direct counts (AODC). Excellent recovery efficiency encouraged the use of IMS to capture Desulfovibrio spp. cells from low-biomass environmental samples. The environmental samples were obtained from a radionuclide-contaminated site in Germany and the chromium (VI)-contaminated Hanford site, an ongoing bioremediation project of the U.S. Department of Energy. Field-deployable IMS technology may greatly facilitate environmental sampling and bioremediation process monitoring and enable transcriptomics and proteomics/metabolomics-based studies directly on cells collected from the field.

  14. An environmentally-friendly, highly efficient, gas pressure-assisted sample introduction system for ICP-MS and its application to detection of cadmium and lead in human plasma.

    PubMed

    Cao, Yupin; Deng, Biyang; Yan, Lizhen; Huang, Hongli

    2017-05-15

    An environmentally friendly and highly efficient gas pressure-assisted sample introduction system (GPASIS) was developed for inductively-coupled plasma mass spectrometry. The GPASIS, consisting of a gas-pressure control device, a customized nebulizer, and a custom-made spray chamber, was fabricated. Its advantages derive from its high nebulization efficiency, small sample volume requirements, low memory effects, good precision, and zero waste emission. The GPASIS can continuously and stably nebulize a 10% NaCl solution for more than an hour without clogging. Sensitivity, detection limits, precision, long-term stability, double-charge and oxide ion levels, nebulization efficiency, and matrix effects of the sample introduction system were evaluated. Experimental results indicated that the performance of this GPASIS was equivalent to, or better than, that of conventional sample introduction systems. The GPASIS was successfully used to determine Cd and Pb in human plasma by ICP-MS. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Molecular dynamics simulations using temperature-enhanced essential dynamics replica exchange.

    PubMed

    Kubitzki, Marcus B; de Groot, Bert L

    2007-06-15

    Today's standard molecular dynamics simulations of moderately sized biomolecular systems at full atomic resolution are typically limited to the nanosecond timescale and therefore suffer from limited conformational sampling. Efficient ensemble-preserving algorithms like replica exchange (REX) may alleviate this problem somewhat but are still computationally prohibitive due to the large number of degrees of freedom involved. Aiming at increased sampling efficiency, we present a novel simulation method combining the ideas of essential dynamics and REX. Unlike standard REX, in each replica only a selection of essential collective modes of a subsystem of interest (essential subspace) is coupled to a higher temperature, with the remainder of the system staying at a reference temperature, T(0). This selective excitation along with the replica framework permits efficient approximate ensemble-preserving conformational sampling and allows much larger temperature differences between replicas, thereby considerably enhancing sampling efficiency. Ensemble properties and sampling performance of the method are discussed using dialanine and guanylin test systems, with multi-microsecond molecular dynamics simulations of these test systems serving as references.
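
    The replica-exchange machinery underneath is the standard Metropolis swap; in TEE-REX the energies entering the criterion belong to the heated essential subspace only. A generic sketch of the swap test (units and constants are illustrative assumptions):

        import math, random

        def accept_swap(E_i, E_j, T_i, T_j, kB=0.0083145):   # kJ/mol/K
            # Exchange configurations of replicas i and j with probability
            # min(1, exp[(1/(kB*T_i) - 1/(kB*T_j)) * (E_i - E_j)]), which
            # preserves the canonical ensemble at each temperature.
            delta = (1.0 / (kB * T_i) - 1.0 / (kB * T_j)) * (E_i - E_j)
            return delta >= 0 or random.random() < math.exp(delta)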

  16. Atomic Oxygen and Space Environment Effects on Aerospace Materials Flown with EOIM-3 Experiment

    NASA Technical Reports Server (NTRS)

    Scialdone, John J.; Clatterbuck, Carroll H.; Ayres-Treusdell, Mary; Park, Gloria; Kolos, Diane

    1996-01-01

    Polymer material samples mounted on a passive carrier tray were flown aboard the STS-46 Atlantis shuttle as a complement to the EOIM-3 (Evaluation of Oxygen Interaction with Materials) experiment, to evaluate the effects of atomic oxygen on the materials and to measure the gaseous shuttle bay environment. The morphological changes of the samples produced by the atomic oxygen fluence of 2.07 x 10(exp 20) atoms/cm(exp 2) are reported. The changes have been verified using Electron Spectroscopy for Chemical Analysis (ESCA), gravimetric measurements, microscopic observations and thermo-optical measurements. The samples, including Kapton, Delrin, epoxies, Beta Cloth, Chemglaze Z306, silver Teflon, silicone coatings, 3M tape, Uralane, Ultem, PEEK, Victrex (PES), polyethersulfone and polymethylpentene thermoplastics, have been characterized by their oxygen reaction efficiency on the basis of their erosion losses and the oxygen fluence. Those efficiencies have been compared to results from other experiments, when available. The efficiencies of the samples are all of the order of 10(exp -24) g/atom. The results indicate that the reaction efficiencies of the reported materials can be grouped into roughly three ranges of values. The least affected materials, with efficiencies from 1 to 10 x 10(exp -25) g/atom, include silicones, epoxies, Uralane and Teflon. A second group, with efficiencies from 10 to 45 x 10(exp -25) g/atom, includes additional silicone coatings, the Chemglaze Z306 paint and Kapton. The third range, from 50 to 75 x 10(exp -25) g/atom, includes organic compounds such as polymethylpentene, PEEK, Ultem, polysulfone and a 3M tape. A Delrin sample had the highest reaction efficiency, 179 x 10(exp -25) g/atom. Two samples, the aluminum Beta Cloth X389-7 and the non-flame-retardant epoxy fiberglass G-11, showed a slight mass increase.

  17. Optimization of Sample Points for Monitoring Arable Land Quality by Simulated Annealing while Considering Spatial Variations

    PubMed Central

    Wang, Junxiao; Wang, Xiaorui; Zhou, Shenglu; Wu, Shaohua; Zhu, Yan; Lu, Chunfeng

    2016-01-01

    With China’s rapid economic development, the reduction in arable land has emerged as one of the most prominent problems in the nation. The long-term dynamic monitoring of arable land quality is important for protecting arable land resources. An efficient practice is to select optimal sample points while obtaining accurate predictions. To this end, the selection of effective points from a dense set of soil sample points is an urgent problem. In this study, data were collected from Donghai County, Jiangsu Province, China. The number and layout of soil sample points are optimized by considering the spatial variations in soil properties and by using an improved simulated annealing (SA) algorithm. The conclusions are as follows: (1) Optimization results in the retention of more sample points in the moderate- and high-variation partitions of the study area; (2) The number of optimal sample points obtained with the improved SA algorithm is markedly reduced, while the accuracy of the predicted soil properties is improved by approximately 5% compared with the raw data; (3) With regard to the monitoring of arable land quality, a dense distribution of sample points is needed to monitor the granularity. PMID:27706051
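
    A compact sketch of the subset-selection loop (the acceptance rule and cooling schedule are standard; the design criterion is a placeholder for the paper's spatial-variation-based objective):

        import math, random

        def anneal_design(n_candidates, k, objective,
                          n_iter=20000, T=1.0, cool=0.9995):
            # Keep k of n_candidates sample points; propose swapping one
            # retained point for a discarded one, accepting worse designs
            # with probability exp(-dE/T) to escape local optima.
            current = set(random.sample(range(n_candidates), k))
            E = objective(current)
            best, E_best = set(current), E
            for _ in range(n_iter):
                out = random.choice(sorted(current))
                inn = random.choice([i for i in range(n_candidates)
                                     if i not in current])
                cand = current - {out} | {inn}
                dE = objective(cand) - E
                if dE <= 0 or random.random() < math.exp(-dE / T):
                    current, E = cand, E + dE
                    if E < E_best:
                        best, E_best = set(current), E
                T *= cool
            return best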

  18. [Method for concentration determination of mineral-oil fog in the air of workplace].

    PubMed

    Xu, Min; Zhang, Yu-Zeng; Liu, Shi-Feng

    2008-05-01

    To study a method for determining the concentration of mineral-oil fog in workplace air. Four filter films (synthetic fabric filter film, beta glass fiber filter film, chronic filter paper and microporous film) were used in this study. Two kinds of dust samplers were used to collect the samples, one sampling at a fast flow rate for a short time and the other sampling at a slow flow rate for a long duration. Subsequently, the filter membranes were weighed with an electronic analytical balance. Based on sampling efficiency and weight increment, the adsorbent abilities of the four filter membranes were compared. When the flow rate was 10-20 L/min and the sampling time was 10-15 min, the average sampling efficiency of the synthetic fabric filter film was 95.61% and the increased weight ranged from 0.87 to 2.60 mg; under the same conditions, the average sampling efficiency of the beta glass fiber filter film was 97.57% and the increased weight was 0.75-2.47 mg. When the flow rate was 5-10 L/min and the sampling time was 10-20 min, the average sampling efficiencies of the chronic filter paper and the microporous film were 48.94% and 63.15%, respectively, with increased weights of 0.75-2.15 mg and 0.23-0.85 mg, respectively. When the flow rate was 3.5 L/min and the sampling time was 100-166 min, the average sampling efficiencies of the beta glass fiber filter film and the synthetic fabric filter film were 94.44% and 93.45%, respectively, and the average increased weights were 1.28 mg and 0.78 mg, respectively; the average sampling efficiencies of the chronic filter paper and the microporous film were 37.65% and 88.21%, with average increased weights of 4.30 mg and 1.23 mg, respectively. Sampling with synthetic fabric filter film or beta glass fiber filter film is credible, accurate, simple and feasible for determining the concentration of mineral-oil fog in workplaces.

  19. Identifying public expectations of genetic biobanks.

    PubMed

    Critchley, Christine; Nicol, Dianne; McWhirter, Rebekah

    2017-08-01

    Understanding public priorities for biobanks is vital for maximising the utility and efficiency of genetic research and maintaining respect for donors. This research directly assessed the relative importance the public places on different expectations of biobanks. Quantitative and qualitative results from a national sample of 800 Australians revealed that the majority attributed more importance to protecting privacy and ethical conduct than to maximising new healthcare benefits, which was in turn viewed as more important than obtaining specific consent, benefit sharing, collaborating and sharing data. A latent class analysis identified two distinct classes displaying different patterns of expectations. One placed higher priority on behaviours that respect the donor (n = 623), the other on accelerating science (n = 278). Additional expectations derived from the qualitative data included the need for biobanks to be transparent, to prioritise their research focus, to educate the public and to address commercialisation.

  20. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    Molecular dynamics (MD) simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules but are limited by the timescale barrier, i.e., we may be unable to efficiently obtain properties because we need to run microseconds or longer simulations using femtosecond time steps. While there are several existing methods to overcome this timescale barrier and efficiently sample thermodynamic and/or kinetic properties, problems remain in regard to being able to sample unknown systems, deal with high-dimensional spaces of collective variables, and focus the computational effort on slow timescales. Hence, a new sampling method, called the "Concurrent Adaptive Sampling (CAS) algorithm," has been developed to tackle these three issues and efficiently obtain conformations and pathways. The method is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and uses macrostates (a partition of the collective variable space) to enhance the sampling. The exploration is done by running a large number of short simulations, and a clustering technique is used to accelerate the sampling. In this paper, we introduce the new methodology and show results from two-dimensional models and bio-molecules, such as penta-alanine and triazine polymer.
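
    The macrostate bookkeeping can be sketched as a weighted-ensemble style split/merge step (a simplified illustration in the spirit of CAS, not the published algorithm): within each macrostate, walkers are resampled to a fixed count with probability proportional to weight, and the state's probability mass is shared equally among the survivors:

        import numpy as np

        def resample_walkers(positions, weights, macro_of, n_per_state, rng):
            # Conserves total probability mass while keeping every occupied
            # macrostate populated with n_per_state walkers.
            groups = {}
            for i, p in enumerate(positions):
                groups.setdefault(macro_of(p), []).append(i)
            new_pos, new_w = [], []
            for idx in groups.values():
                w = np.array([weights[i] for i in idx])
                total = w.sum()
                for i in rng.choice(idx, size=n_per_state, p=w / total):
                    new_pos.append(positions[i])
                    new_w.append(total / n_per_state)
            return new_pos, new_w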

  1. Simple, Defensible Sample Sizes Based on Cost Efficiency

    PubMed Central

    Bacchetti, Peter; McCulloch, Charles E.; Segal, Mark R.

    2009-01-01

    Summary The conventional approach of choosing sample size to provide 80% or greater power ignores the cost implications of different sample size choices. Costs, however, are often impossible for investigators and funders to ignore in actual practice. Here, we propose and justify a new approach for choosing sample size based on cost efficiency, the ratio of a study’s projected scientific and/or practical value to its total cost. By showing that a study’s projected value exhibits diminishing marginal returns as a function of increasing sample size for a wide variety of definitions of study value, we are able to develop two simple choices that can be defended as more cost efficient than any larger sample size. The first is to choose the sample size that minimizes the average cost per subject. The second is to choose sample size to minimize total cost divided by the square root of sample size. This latter method is theoretically more justifiable for innovative studies, but also performs reasonably well and has some justification in other cases. For example, if projected study value is assumed to be proportional to power at a specific alternative and total cost is a linear function of sample size, then this approach is guaranteed either to produce more than 90% power or to be more cost efficient than any sample size that does. These methods are easy to implement, based on reliable inputs, and well justified, so they should be regarded as acceptable alternatives to current conventional approaches. PMID:18482055
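
    Both rules reduce to a one-dimensional search once a total cost function is specified. A sketch with a hypothetical cost function (the numbers are illustrative only, not from the paper):

        import numpy as np

        def defensible_n(cost, n_max=10000):
            # Rule 1: minimise the average cost per subject, cost(n)/n.
            # Rule 2: minimise cost(n)/sqrt(n) (for innovative studies).
            n = np.arange(1, n_max + 1)
            c = np.array([cost(k) for k in n])
            return n[np.argmin(c / n)], n[np.argmin(c / np.sqrt(n))]

        # fixed cost, per-subject cost, and rising marginal cost past 500
        n1, n2 = defensible_n(
            lambda k: 50000 + 100 * k + 0.2 * max(0, k - 500) ** 2)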

  2. Evaluations of the Method to Measure Black Carbon Particles Suspended in Rainwater and Snow Samples

    NASA Astrophysics Data System (ADS)

    Ohata, S.; Moteki, N.; Schwarz, J. P.; Fahey, D. W.; Kondo, Y.

    2012-12-01

    The mass concentrations and size distributions of black carbon (BC) particles in rainwater and snow are important parameters for improved understanding of the wet deposition of BC, which is a key process in quantifying the impacts of BC on climate. In this study, we have evaluated a new method to measure these parameters. The approach consists of an ultrasonic nebulizer (USN) used in conjunction with a Single Particle Soot Photometer (SP2). The USN converts sample water into micron-size droplets at a constant rate and then extracts airborne BC particles by dehydrating the water droplets. The mass of individual BC particles is measured by the SP2, based on the laser-induced incandescence technique. The combination of the USN and SP2 enables the measurement of BC particles using only a small amount of sample water, typically 10 ml (Ohata et al., 2011). However, the loss of BC during the extraction process depends on particle size. We determined the size-dependent extraction efficiency using polystyrene latex spheres (PSLs) with twelve different diameters between 100 and 1050 nm. The PSL concentrations in water were determined by light extinction at 532 nm. The extraction efficiency of the USN showed a broad maximum in the diameter range of 200-500 nm and decreased substantially at larger sizes. The extraction efficiency determined using the PSL standards agreed to within ±40% with that determined using laboratory-generated BC concentration standards. We applied this method to the analysis of rainwater collected in Tokyo and in Okinawa over the East China Sea. Measured BC size distributions in all rainwater samples showed a negligible contribution of BC particles larger than 600 nm to the total BC amounts. However, for BC particles in surface snow collected in Greenland and Antarctica, size distributions were sometimes shifted to much larger size ranges.

  3. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    PubMed

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest and sampling relevant objects at sites placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
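
    The sampling rule itself is just a random start followed by equidistant steps, applied per axis. A minimal sketch (the extent and step values are illustrative):

        import random

        def systematic_sites(extent, step, rng=random):
            # One random start inside the first interval, then equidistant
            # sites across the structure of interest.
            start = rng.uniform(0, step)
            sites = []
            while start <= extent:
                sites.append(start)
                start += step
            return sites

        x_sites = systematic_sites(extent=5000.0, step=250.0)  # e.g. in um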

  4. Occurrence and fate of benzotriazoles UV filters in a typical residential wastewater treatment plant in Harbin, China.

    PubMed

    Zhao, Xue; Zhang, Zi-Feng; Xu, Lei; Liu, Li-Yan; Song, Wei-Wei; Zhu, Fu-Jie; Li, Yi-Fan; Ma, Wan-Li

    2017-08-01

    Benzotriazole (BTs) UV filters are widely used as ultraviolet absorbents in daily products and have received increasing attention in the past decades. A residential wastewater treatment plant (WWTP) is both an important sink for these chemicals in wastewater and a key pollution source for receiving waters. In this study, a pretreatment and gas chromatography-tandem mass spectrometry analysis method was developed to determine the occurrence and fate of 9 BTs UV filters in wastewater and sludge from a WWTP with an anaerobic-oxic treatment process (A/O) and a biological aerated filter treatment process (BAF). In total, 81 wastewater samples and 11 sludge samples were collected over four seasons. In wastewater, UV-326 and UV-329 were the most frequently detected, while the highest mean concentrations were found for UV-234 and UV-329. Removal efficiencies were above 85% in the A/O process and 60-77% in the BAF process, except for UV-350, which was more difficult to remove, with a lower removal efficiency of 33.3% in both processes. All the target chemicals except UV-320 were detected in sludge samples, with mean concentrations ranging from 0.90 ng/g to 303.39 ng/g. There was no significant difference in concentrations or removal efficiencies among seasons. The higher detection frequencies and concentrations of BTs UV filters downstream in the receiving water system indicated the contribution of the WWTP effluent. Compared with other rivers, the lower concentrations in the surface water of the Songhua River indicated a mild pollution status with respect to BTs UV filters. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Efficiency assessment of wastewater treatment plants: A data envelopment analysis approach integrating technical, economic, and environmental issues.

    PubMed

    Castellet, Lledó; Molinos-Senante, María

    2016-02-01

    The assessment of the efficiency of wastewater treatment plants (WWTPs) is essential to compare their performance and consequently to identify the best operational practices that can contribute to the reduction of operational costs. Previous studies have evaluated the efficiency of WWTPs using conventional data envelopment analysis (DEA) models. Most of these studies have considered the operational costs of the WWTPs as inputs, while the pollutants removed from wastewater are treated as outputs. However, they have ignored the fact that each pollutant removed by a WWTP involves a different environmental impact. To overcome this limitation, this paper evaluates for the first time the efficiency of a sample of WWTPs by applying the weighted slacks-based measure model. It is a non-radial DEA model which allows weights to be assigned to the inputs and outputs according to their importance. Thus, the assessment carried out integrates environmental issues with the traditional "techno-economic" efficiency assessment of WWTPs. Moreover, the potential economic savings for each cost item have been quantified at a plant level. It is illustrated that the WWTPs analyzed have significant room to save staff and energy costs. Several managerial implications to help WWTPs' operators make informed decisions were drawn from the methodology and empirical application carried out. Copyright © 2015 Elsevier Ltd. All rights reserved.
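
    For orientation, a plain input-oriented CCR DEA model is easy to set up as a linear program; the weighted slacks-based measure used in the paper is a non-radial refinement of this idea, so the sketch below is the textbook radial model, not the study's exact formulation:

        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, o):
            # X: (n_units, n_inputs), Y: (n_units, n_outputs); returns the
            # radial efficiency theta of unit o. Variables: [theta, lambda].
            n, m = X.shape
            s = Y.shape[1]
            c = np.r_[1.0, np.zeros(n)]              # minimise theta
            A_in = np.c_[-X[o], X.T]                 # sum lam*x <= theta*x_o
            A_out = np.c_[np.zeros(s), -Y.T]         # sum lam*y >= y_o
            res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[np.zeros(m), -Y[o]],
                          bounds=[(0, None)] * (n + 1))
            return res.x[0]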

  6. Testing the reciprocal associations among co-worker incivility, organisational inefficiency, and work-related exhaustion: A one-year, cross-lagged study

    PubMed Central

    Viotti, Sara; Essenmacher, Lynnette; Hamblin, Lydia E.; Arnetz, Judith E.

    2018-01-01

    In spite of the considerable number of studies on co-worker incivility, knowledge on this topic needs to be further enhanced. In particular, no studies have focused on the reciprocal nature of the relationship of incivility with other important aspects of working life, i.e. employee well-being and the quality of the working process. The aim of the present study was to examine the cross-lagged associations among co-worker incivility, work-related exhaustion, and organisational efficiency in a sample of healthcare workers. Based on conservation of resources theory, we hypothesised that these three variables affect each other reciprocally over time. Data from a two-wave study design (with a one-year time lag) were utilised, and cross-lagged structural equation models were performed. Results confirmed that incivility and efficiency affected each other reciprocally over time. On the other hand, whereas incivility positively predicted exhaustion, and exhaustion in turn inversely predicted organisational efficiency, the opposite paths were found to be not significant. The study suggests that efficiency is crucial for understanding incivility because it operates both as its cause and as its outcome. Interventions aimed at promoting civility and respect in the workplace may help prevent co-worker incivility and work-related exhaustion, and enhance organisational efficiency.

  7. An efficient interpolation technique for jump proposals in reversible-jump Markov chain Monte Carlo calculations

    PubMed Central

    Farr, W. M.; Mandel, I.; Stevens, D.

    2015-01-01

    Selection among alternative theoretical models given an observed dataset is an important challenge in many areas of physics and astronomy. Reversible-jump Markov chain Monte Carlo (RJMCMC) is an extremely powerful technique for performing Bayesian model selection, but it suffers from a fundamental difficulty: it requires jumps between model parameter spaces, yet cannot efficiently explore both parameter spaces at once. Thus, a naive jump between parameter spaces is unlikely to be accepted in the Markov chain Monte Carlo (MCMC) algorithm, and convergence is correspondingly slow. Here, we demonstrate an interpolation technique that uses samples from single-model MCMCs to propose intermodel jumps from an approximation to the single-model posterior of the target parameter space. The interpolation technique, based on a kD-tree data structure, is adaptive and efficient in modest dimensionality. We show that our technique leads to improved convergence over naive jumps in an RJMCMC, and compare it to other proposals in the literature to improve the convergence of RJMCMCs. We also demonstrate the use of the same interpolation technique as a way to construct efficient ‘global’ proposal distributions for single-model MCMCs without prior knowledge of the structure of the posterior distribution, and discuss improvements that permit the method to be used in higher dimensional spaces efficiently. PMID:26543580
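
    The same trick can be illustrated with a kernel density estimate in place of the kD-tree (a simple stand-in that serves the same purpose, and keeps the proposal density q(x), needed in the Metropolis-Hastings ratio, available in closed form):

        import numpy as np

        class KDEProposal:
            # Global proposal built from stored single-model posterior
            # samples: jitter a randomly chosen stored sample with
            # Gaussian noise of bandwidth h.
            def __init__(self, samples, h=0.1):
                self.s = np.asarray(samples)         # (n, d) stored samples
                self.h = h
            def draw(self, rng):
                i = rng.integers(len(self.s))
                return self.s[i] + self.h * rng.standard_normal(self.s.shape[1])
            def logq(self, x):
                # log of the KDE density, usable in the acceptance ratio
                d = self.s.shape[1]
                r2 = np.sum((self.s - x) ** 2, axis=1) / self.h ** 2
                logk = -0.5 * r2 - d * np.log(self.h) - 0.5 * d * np.log(2 * np.pi)
                return np.logaddexp.reduce(logk) - np.log(len(self.s))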

  8. Supercritical Fluid Extraction of Metal Chelate: A Review.

    PubMed

    Ding, Xin; Liu, Qinli; Hou, Xiongpo; Fang, Tao

    2017-03-04

    Supercritical fluid extraction (SFE), as a new green extraction technology, has been used to extract various metal species. The solubilities of chelating agents and of the corresponding metal chelates are the key factors influencing the efficiency of SFE; other key properties, such as stability and selectivity, are also reviewed. The extraction mechanisms of the principal chelating agents are explained by typical examples in this paper, which is an important aspect of the SFE of metal ions. Moreover, the extraction efficiencies of metal species also depend on other factors such as temperature, pressure, extraction time and matrix effects. The two main complexation methods, namely in-situ and on-line chelating SFE, are described in detail. As an efficient chelating agent, the tributyl phosphate-nitric acid (TBP-HNO3) complex attracts much attention. The SFE of metal ions, lanthanides and actinides, as well as organometallic compounds, is also summarized. With the proper selection of ligands, highly efficient extraction of metal species can be obtained. As an efficient sample analysis method, supercritical fluid chromatography (SFC) is introduced in this paper. The kinetics related to the SFE of metal species is discussed with some specific examples.

  9. Division of methods for counting helminths' eggs and the problem of efficiency of these methods.

    PubMed

    Jaromin-Gleń, Katarzyna; Kłapeć, Teresa; Łagód, Grzegorz; Karamon, Jacek; Malicki, Jacek; Skowrońska, Agata; Bieganowski, Andrzej

    2017-03-21

    From the sanitary and epidemiological points of view, information concerning the developmental forms of intestinal parasites, especially helminth eggs present in the environment (in water, soil, sandpits, sewage sludge, and crops watered with wastewater), is very important. The methods described in the relevant literature may be classified in various ways, primarily according to the methodology for preparing samples from environmental matrices for analysis, and according to the counting methods and the chambers/instruments used for this purpose. In addition, the research methods may be classified by the manner and time of identification of the individuals counted, or by the necessity of staining them. Standard methods for identification of helminth eggs in environmental matrices are usually characterized by low efficiency, i.e., from 30% to approximately 80%. The efficiency of the method applied may be measured in two ways, either by the internal standard method or by the 'Split/Spike' method. When the efficiency of the method and the number of eggs are measured simultaneously in an examined object, the 'actual' number of eggs may be calculated by multiplying the number of helminth eggs found by the inverse of the efficiency (for example, 30 eggs counted at 60% efficiency imply an actual count of 30/0.6 = 50 eggs).

  10. Histopathological examination of nerve samples from pure neural leprosy patients: obtaining maximum information to improve diagnostic efficiency.

    PubMed

    Antunes, Sérgio Luiz Gomes; Chimelli, Leila; Jardim, Márcia Rodrigues; Vital, Robson Teixeira; Nery, José Augusto da Costa; Corte-Real, Suzana; Hacker, Mariana Andréa Vilas Boas; Sarno, Euzenir Nunes

    2012-03-01

    Nerve biopsy examination is an important auxiliary procedure for diagnosing pure neural leprosy (PNL). When acid-fast bacilli (AFB) are not detected in the nerve sample, the value of other nonspecific histological alterations should be considered along with pertinent clinical, electroneuromyographical and laboratory data (the detection of Mycobacterium leprae DNA with polymerase chain reaction and the detection of serum anti-phenolic glycolipid 1 antibodies) to support a possible or probable PNL diagnosis. Three hundred forty nerve samples [144 from PNL patients and 196 from patients with non-leprosy peripheral neuropathies (NLN)] were examined. Both AFB-negative and AFB-positive PNL samples had more frequent histopathological alterations (epithelioid granulomas, mononuclear infiltrates, fibrosis, perineurial and subperineurial oedema and decreased numbers of myelinated fibres) than the NLN group. Multivariate analysis revealed that independently, mononuclear infiltrate and perineurial fibrosis were more common in the PNL group and were able to correctly classify AFB-negative PNL samples. These results indicate that even in the absence of AFB, these histopathological nerve alterations may justify a PNL diagnosis when observed in conjunction with pertinent clinical, epidemiological and laboratory data.

  11. An efficient protocol for tissue sampling and DNA isolation from the stem bark of Leguminosae trees.

    PubMed

    Novaes, R M L; Rodrigues, J G; Lovato, M B

    2009-02-03

    Traditionally, molecular studies of plant species have used leaves as the source of DNA. However, sampling leaves from tall tree species can be quite difficult and expensive. We developed a sequence of procedures for using stem bark as a source of DNA from Leguminosae trees of the Atlantic Forest and the Cerrado. Leguminosae is an important species-rich family in these two highly diverse and endangered biomes. A modified CTAB protocol for DNA isolation is described, and details of the procedures for sampling and storage of the bark are given. The procedures were initially developed for three species, and then their applicability for 15 other species was evaluated. DNA of satisfactory quality was obtained from the bark of all species. The amounts of DNA obtained from leaves were slightly higher than from bark samples, while its purity was the same. Storing the bark frozen or by drying in silica gel yielded similar results. Polymerase chain reaction amplification worked for both plastid and nuclear genomes. This alternative for isolating DNA from bark samples of trees facilitates field work with these tree species.

  12. Pigment and Binder Concentrations in Modern Paint Samples Determined by IR and Raman Spectroscopy.

    PubMed

    Wiesinger, Rita; Pagnin, Laura; Anghelone, Marta; Moretto, Ligia M; Orsega, Emilio F; Schreiner, Manfred

    2018-06-18

    Knowledge of the techniques employed by artists, such as the composition of the paints, colour palette, and painting style, is of crucial importance not only to attribute works of art to the workshop or artist but also to develop strategies and measures for the conservation and restoration of the art. While much research has been devoted to investigating the composition of an artist's materials from a qualitative point of view, little effort has been made in terms of quantitative analyses. This study aims to quantify the relative concentrations of binders (acrylic and alkyd) and inorganic pigments in different paint samples by IR and Raman spectroscopies. To perform this quantitative evaluation, reference samples of known concentrations were prepared to obtain calibration plots. In a further step, the quantification method was verified by additional test samples and commercially available paint tubes. The results obtained confirm that the quantitative method developed for IR and Raman spectroscopy is able to efficiently determine different pigment and binder concentrations of paint samples with high accuracy. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.
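
    The calibration step is ordinary linear regression of (relative) band intensity against known concentration in the reference samples, inverted for the unknown sample. A sketch with placeholder numbers (not data from the study):

        import numpy as np

        conc = np.array([0.0, 10.0, 20.0, 30.0, 40.0])        # wt% pigment
        intensity = np.array([0.02, 0.21, 0.39, 0.61, 0.80])  # band area
        slope, intercept = np.polyfit(conc, intensity, 1)
        unknown_conc = (0.50 - intercept) / slope   # invert the calibration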

  13. Sampling efficiency of modified 37-mm sampling cassettes using computational fluid dynamics.

    PubMed

    Anthony, T Renée; Sleeth, Darrah; Volckens, John

    2016-01-01

    In the U.S., most industrial hygiene practitioners continue to rely on the closed-face cassette (CFC) to assess worker exposures to hazardous dusts, primarily because of its ease of use, cost, and familiarity. However, mass concentrations measured with this classic sampler underestimate exposures to larger particles throughout the inhalable particulate mass (IPM) size range (up to aerodynamic diameters of 100 μm). To investigate whether the current 37-mm inlet cap can be redesigned to better meet the IPM sampling criterion, computational fluid dynamics (CFD) models were developed, and particle sampling efficiencies associated with various modifications to the CFC inlet cap were determined. Simulations of fluid flow (standard k-epsilon turbulence model) and particle transport (laminar trajectories, 1-116 μm) were conducted using sampling flow rates of 10 L min(-1) in slow-moving air (0.2 m s(-1)) in the facing-the-wind orientation. Combinations of seven inlet shapes and three inlet diameters were evaluated as candidates to replace the current 37-mm inlet cap. For a given inlet geometry, differences in sampler efficiency between inlet diameters averaged less than 1% for particles through 100 μm, but the largest opening was found to increase the efficiency for the 116 μm particles by 14% for the flat inlet cap. A substantial reduction in sampler efficiency was identified for sampler inlets with side walls extending beyond the dimension of the external lip of the current 37-mm CFC. The inlet cap based on the 37-mm CFC dimensions with an expanded 15-mm entry provided the best agreement with facing-the-wind human aspiration efficiency. The sampler efficiency was increased with a flat entry or with a thin central lip adjacent to the new enlarged entry. This work provides a substantial body of sampling efficiency estimates as a function of particle size and inlet geometry for personal aerosol samplers.

  14. Sampling characteristics and calibration of snorkel counts to estimate stream fish populations

    USGS Publications Warehouse

    Weaver, D.; Kwak, Thomas J.; Pollock, Kenneth

    2014-01-01

    Snorkeling is a versatile technique for estimating lotic fish population characteristics; however, few investigators have evaluated its accuracy at population or assemblage levels. We evaluated the accuracy of snorkeling using prepositioned areal electrofishing (PAE) for estimating fish populations in a medium-sized Appalachian Mountain river during fall 2008 and summer 2009. Strip-transect snorkel counts were calibrated with PAE counts in identical locations among macrohabitats, fish species or taxa, and seasons. Mean snorkeling efficiency (i.e., the proportion of individuals counted from the true population) among all taxa and seasons was 14.7% (SE, 2.5%), and the highest efficiencies were for River Chub Nocomis micropogon at 21.1% (SE, 5.9%), Central Stoneroller Campostoma anomalum at 20.3% (SE, 9.6%), and darters (Percidae) at 17.1% (SE, 3.7%), whereas efficiencies were lower for shiners (Notropis spp., Cyprinella spp., Luxilus spp.) at 8.2% (SE, 2.2%) and suckers (Catostomidae) at 6.6% (SE, 3.2%). Macrohabitat type, fish taxon, or sampling season did not significantly explain variance in snorkeling efficiency. Mean snorkeling detection probability (i.e., probability of detecting at least one individual of a taxon) among fish taxa and seasons was 58.4% (SE, 6.1%). We applied the efficiencies from our calibration study to adjust snorkel counts from an intensive snorkeling survey conducted in a nearby reach. Total fish density estimates from strip-transect counts adjusted for snorkeling efficiency were 7,288 fish/ha (SE, 1,564) during summer and 15,805 fish/ha (SE, 4,947) during fall. Precision of fish density estimates is influenced by variation in snorkeling efficiency and sample size and may be increased with additional sampling effort. These results demonstrate the sampling properties and utility of snorkeling to characterize lotic fish assemblages with acceptable efficiency and detection probability, less effort, and no mortality, compared with traditional sampling methods.

  15. [Sampling optimization for tropical invertebrates: an example using dung beetles (Coleoptera: Scarabaeinae) in Venezuela].

    PubMed

    Ferrer-Paris, José Rafael; Sánchez-Mercado, Ada; Rodríguez, Jon Paul

    2013-03-01

    The development of efficient sampling protocols is an essential prerequisite to evaluate and identify priority conservation areas. There are few protocols for fauna inventory and monitoring at wide geographical scales in the tropics, where the complexity of communities and high biodiversity levels make the implementation of efficient protocols more difficult. We propose here a simple strategy to optimize the capture of dung beetles, applied to sampling with baited traps and generalizable to other sampling methods. We analyzed data from eight transects sampled between 2006-2008 with the aim of developing a uniform sampling design that allows species richness, abundance and composition to be estimated confidently at wide geographical scales. We examined four characteristics of any sampling design that affect the effectiveness of the sampling effort: the number of traps, sampling duration, type and proportion of bait, and spatial arrangement of the traps along transects. We used species accumulation curves, rank-abundance plots, indicator species analysis, and multivariate correlograms. We captured 40 337 individuals (115 species/morphospecies of 23 genera). Most species were attracted by both dung and carrion, but two thirds had greater relative abundance in traps baited with human dung. Different aspects of the sampling design influenced each diversity attribute in different ways. To obtain reliable richness estimates, the number of traps was the most important aspect. Accurate abundance estimates were obtained when the sampling period was increased, while the spatial arrangement of traps was determinant for capturing the species composition pattern. An optimum sampling strategy for accurate estimates of richness, abundance and diversity should: (1) set 50-70 traps to maximize the number of species detected, (2) collect samples during 48-72 hours and set trap groups along the transect to reliably estimate species abundance, (3) set traps in groups of at least 10 traps to suitably record the local species composition, and (4) separate trap groups by a distance greater than 5-10 km to avoid spatial autocorrelation. For the evaluation of other sampling protocols, we recommend first identifying the elements of the sampling design that could affect the sampling effort (the number of traps, sampling duration, type and proportion of bait) and their spatial distribution (spatial arrangement of the traps), and then evaluating how they affect richness, abundance and species composition estimates.

  16. Efficient stochastic approaches for sensitivity studies of an Eulerian large-scale air pollution model

    NASA Astrophysics Data System (ADS)

    Dimov, I.; Georgieva, R.; Todorov, V.; Ostromsky, Tz.

    2017-10-01

    Reliability of large-scale mathematical models is an important issue when such models are used to support decision makers. Sensitivity analysis of model outputs to variation or natural uncertainties of model inputs is crucial for improving the reliability of mathematical models. A comprehensive experimental study of Monte Carlo algorithms based on Sobol sequences for multidimensional numerical integration has been done. A comparison with Latin hypercube sampling and a particular quasi-Monte Carlo lattice rule based on generalized Fibonacci numbers has been presented. The algorithms have been successfully applied to compute global Sobol sensitivity measures corresponding to the influence of several input parameters (six chemical reaction rates and four different groups of pollutants) on the concentrations of important air pollutants. The concentration values have been generated by the Unified Danish Eulerian Model. The sensitivity study has been done for the areas of several European cities with different geographical locations. The numerical tests show that the stochastic algorithms under consideration are efficient for multidimensional integration, and especially for computing sensitivity indices that are small in value. This is a crucial point, since even small indices may need to be estimated in order to achieve a more accurate distribution of input influences and a more reliable interpretation of the mathematical model results.
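
    The gain from Sobol sequences over plain Monte Carlo is easy to reproduce on a smooth toy integrand (a stand-in for the model runs; the true integral below is 1):

        import numpy as np
        from scipy.stats import qmc

        f = lambda x: np.prod(2.0 * x, axis=1)   # integral over [0,1]^6 is 1
        n, d = 2 ** 12, 6
        sobol = qmc.Sobol(d=d, scramble=True, seed=0).random(n)
        plain = np.random.default_rng(0).random((n, d))
        print(abs(f(sobol).mean() - 1.0))        # QMC error
        print(abs(f(plain).mean() - 1.0))        # MC error, typically larger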

  17. Designing efficient surveys: spatial arrangement of sample points for detection of invasive species

    Treesearch

    Ludek Berec; John M. Kean; Rebecca Epanchin-Niell; Andrew M. Liebhold; Robert G. Haight

    2015-01-01

    Effective surveillance is critical to managing biological invasions via early detection and eradication. The efficiency of surveillance systems may be affected by the spatial arrangement of sample locations. We investigate how the spatial arrangement of sample points, ranging from random to fixed grid arrangements, affects the probability of detecting a target...

  18. Photo stratification improves northwest timber volume estimates.

    Treesearch

    Colin D. MacLean

    1972-01-01

    Data from extensive timber inventories of 12 counties in western and central Washington were analyzed to test the relative efficiency of double sampling for stratification as a means of estimating total volume. Photo and field plots, when combined in a stratified sampling design, proved about twice as efficient as simple field sampling. Although some gains were made by...

  19. Improving the sampling efficiency of Monte Carlo molecular simulations: an evolutionary approach

    NASA Astrophysics Data System (ADS)

    Leblanc, Benoit; Braunschweig, Bertrand; Toulhoat, Hervé; Lutton, Evelyne

    We present a new approach to improve the convergence of Monte Carlo (MC) simulations of molecular systems belonging to complex energetic landscapes: the problem is redefined in terms of the dynamic allocation of MC move frequencies depending on their past efficiency, measured with respect to a relevant sampling criterion. We introduce various empirical criteria with the aim of accounting for proper convergence in phase space sampling. The dynamic allocation is performed over parallel simulations by means of a new evolutionary algorithm involving 'immortal' individuals. The method is benchmarked against conventional procedures on a model of melt linear polyethylene. We record significant improvements in sampling efficiency, and thus in computational load, while the optimal sets of move frequencies are liable to allow interesting physical insights into the particular systems simulated. This last aspect should provide a new tool for designing more efficient new MC moves.
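
    The allocation rule itself is simple once a per-move efficiency score is recorded. A sketch (the scores and floor value are illustrative assumptions, and the evolutionary machinery of the paper is not reproduced here):

        import random

        def reallocate(moves, scores, floor=0.05):
            # Frequency of each MC move type proportional to its measured
            # past efficiency (e.g. accepted displacement per CPU second),
            # with a floor so no move is ever switched off entirely.
            total = sum(scores[m] for m in moves)
            return {m: floor + (1 - floor * len(moves)) * scores[m] / total
                    for m in moves}

        freqs = reallocate(["translate", "rotate", "reptate"],
                           {"translate": 2.0, "rotate": 0.5, "reptate": 3.5})
        move = random.choices(list(freqs), weights=list(freqs.values()))[0]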

  1. A new method for evaluating radon and thoron alpha-activities per unit volume inside and outside various natural material samples by calculating SSNTD detection efficiencies for the emitted alpha-particles and measuring the resulting track densities.

    PubMed

    Misdaq, M A; Aitnouh, F; Khajmi, H; Ezzahery, H; Berrazzouk, S

    2001-08-01

    A Monte Carlo computer code for determining detection efficiencies of the CR-39 and LR-115 II solid-state nuclear track detectors (SSNTD) for alpha-particles emitted by the uranium and thorium series inside different natural material samples was developed. The influence of the alpha-particle initial energy on the SSNTD detection efficiencies was investigated. Radon (222Rn) and thoron (220Rn) alpha-activities per unit volume were evaluated inside and outside the natural material samples by exploiting data obtained for the detection efficiencies of the SSNTD utilized for the emitted alpha-particles, and measuring the resulting track densities. Results obtained were compared to those obtained by other methods. Radon emanation coefficients have been determined for some of the considered material samples.

  2. Design and testing of a shrouded probe for airborne aerosol sampling in a high velocity airstream

    NASA Astrophysics Data System (ADS)

    Cain, Stuart Arthur

    1997-07-01

    Tropospheric aerosols play an important role in many phenomena related to global climate and climate change and two important parameters, aerosol size distribution and concentration, have been the focus of a great deal of attention. To study these parameters it is necessary to obtain a representative sample of the ambient aerosol using an airborne aerosol sampling probe mounted on a suitably equipped aircraft. Recently, however, serious questions have been raised (Huebert et al., 1990; Baumgardner et al., 1991) concerning the current procedures and techniques used in airborne aerosol sampling. We believe that these questions can be answered by: (1) use of a shrouded aerosol sampling probe, (2) proper aerodynamic sampler design using numerical simulation techniques, (3) calculation of the sampler calibration curve to be used in determining free-stream aerosol properties from measurements made with the sampler and (4) wind tunnel tests to verify the design and investigate the performance of the sampler at small angles of attack (typical in airborne sampling applications due to wind gusts and aircraft fuel consumption). Our analysis is limited to the collection of insoluble particles representative of the global tropospheric 'background aerosol' (0.1-2.6 μm diameter) whose characteristics are least likely to be affected by the collection process. We begin with a survey of the most relevant problems associated with current airborne aerosol samplers and define the physical quantity that we wish to measure. This includes the derivation of a unique mathematical expression relating the free-stream aerosol size distribution to aerosol data obtained from the airborne measurements with the sampler. We follow with the presentation of the results of our application of Computational Fluid Dynamics (CFD) and Computational Particle Dynamics (CPD) to the design of a shrouded probe for airborne aerosol sampling of insoluble tropospheric particles in the size range 0.1 to 15 μm diameter at an altitude of 6069 m (20,000 ft) above sea level (asl). Our aircraft of choice is the National Center for Atmospheric Research (NCAR) EC-130 Geoscience Research aircraft whose cruising speed at a sampling altitude of 6069 m asl is 100 m/s. We calculate the aspiration efficiency of the sampler and estimate the transmission efficiency of the diffuser probe based on particle trajectory simulations. We conclude by presenting the results of a series of qualitative and quantitative wind tunnel tests of the airflow through a plexiglass prototype of the sampler to verify our numerical simulations and predict the performance of the sampler at angles of attack from 0° to 15°.

  3. Evaluation of Pump Pulsation in Respirable Size-Selective Sampling: Part II. Changes in Sampling Efficiency

    PubMed Central

    Lee, Eun Gyung; Lee, Taekhee; Kim, Seung Won; Lee, Larry; Flemmer, Michael M.; Harper, Martin

    2015-01-01

    This second, and concluding, part of this study evaluated changes in sampling efficiency of respirable size-selective samplers due to air pulsations generated by the selected personal sampling pumps characterized in Part I (Lee E, Lee L, Möhlmann C et al. Evaluation of pump pulsation in respirable size-selective sampling: Part I. Pulsation measurements. Ann Occup Hyg 2013). Nine particle sizes of monodisperse ammonium fluorescein (from 1 to 9 μm mass median aerodynamic diameter) were generated individually by a vibrating orifice aerosol generator from dilute solutions of fluorescein in aqueous ammonia and then injected into an environmental chamber. To collect these particles, 10-mm nylon cyclones, also known as Dorr-Oliver (DO) cyclones, were used with five medium volumetric flow rate pumps. Those were the Apex IS, HFS513, GilAir5, Elite5, and Basic5 pumps, which were found in Part I to generate pulsations of 5% (the lowest), 25%, 30%, 56%, and 70% (the highest), respectively. GK2.69 cyclones were used with the Legacy [pump pulsation (PP) = 15%] and Elite12 (PP = 41%) pumps for collection at high flows. The DO cyclone was also used to evaluate changes in sampling efficiency due to pulse shape. The HFS513 pump, which generates a more complex pulse shape, was compared to a single sine wave fluctuation generated by a piston. The luminescent intensity of the fluorescein extracted from each sample was measured with a luminescence spectrometer. Sampling efficiencies were obtained by dividing the intensity of the fluorescein extracted from the filter placed in a cyclone with the intensity obtained from the filter used with a sharp-edged reference sampler. Then, sampling efficiency curves were generated using a sigmoid function with three parameters and each sampling efficiency curve was compared to that of the reference cyclone by constructing bias maps. In general, no change in sampling efficiency (bias under ±10%) was observed until pulsations exceeded 25% for the DO cyclone. However, for three models of pumps producing 30%, 56%, and 70% pulsations, substantial changes were confirmed. The GK2.69 cyclone showed a similar pattern to that of the DO cyclone, i.e. no change in sampling efficiency for the Legacy producing 15% pulsation and a substantial change for the Elite12 producing 41% pulsation. Pulse shape did not cause any change in sampling efficiency when compared to the single sine wave. The findings suggest that 25% pulsation at the inlet of the cyclone as measured by this test can be acceptable for the respirable particle collection. If this test is used in place of that currently in European standards (EN 1232–1997 and EN 12919-1999) or is used in any International Organization for Standardization standard, then a 25% pulsation criterion could be adopted. This work suggests that a 10% criterion as currently specified in the European standards for testing may be overly restrictive and not able to be met by many pumps on the market. Further work is recommended to determine which criterion would be applicable to this test if it is to be retained in its current form. PMID:24064963

  4. Evaluation of pump pulsation in respirable size-selective sampling: part II. Changes in sampling efficiency.

    PubMed

    Lee, Eun Gyung; Lee, Taekhee; Kim, Seung Won; Lee, Larry; Flemmer, Michael M; Harper, Martin

    2014-01-01

    This second, and concluding, part of this study evaluated changes in sampling efficiency of respirable size-selective samplers due to air pulsations generated by the selected personal sampling pumps characterized in Part I (Lee E, Lee L, Möhlmann C et al. Evaluation of pump pulsation in respirable size-selective sampling: Part I. Pulsation measurements. Ann Occup Hyg 2013). Nine particle sizes of monodisperse ammonium fluorescein (from 1 to 9 μm mass median aerodynamic diameter) were generated individually by a vibrating orifice aerosol generator from dilute solutions of fluorescein in aqueous ammonia and then injected into an environmental chamber. To collect these particles, 10-mm nylon cyclones, also known as Dorr-Oliver (DO) cyclones, were used with five medium volumetric flow rate pumps. Those were the Apex IS, HFS513, GilAir5, Elite5, and Basic5 pumps, which were found in Part I to generate pulsations of 5% (the lowest), 25%, 30%, 56%, and 70% (the highest), respectively. GK2.69 cyclones were used with the Legacy [pump pulsation (PP) = 15%] and Elite12 (PP = 41%) pumps for collection at high flows. The DO cyclone was also used to evaluate changes in sampling efficiency due to pulse shape. The HFS513 pump, which generates a more complex pulse shape, was compared to a single sine wave fluctuation generated by a piston. The luminescent intensity of the fluorescein extracted from each sample was measured with a luminescence spectrometer. Sampling efficiencies were obtained by dividing the intensity of the fluorescein extracted from the filter placed in a cyclone by the intensity obtained from the filter used with a sharp-edged reference sampler. Then, sampling efficiency curves were generated using a sigmoid function with three parameters and each sampling efficiency curve was compared to that of the reference cyclone by constructing bias maps. In general, no change in sampling efficiency (bias under ±10%) was observed until pulsations exceeded 25% for the DO cyclone. However, for three models of pumps producing 30%, 56%, and 70% pulsations, substantial changes were confirmed. The GK2.69 cyclone showed a similar pattern to that of the DO cyclone, i.e. no change in sampling efficiency for the Legacy producing 15% pulsation and a substantial change for the Elite12 producing 41% pulsation. Pulse shape did not cause any change in sampling efficiency when compared to the single sine wave. The findings suggest that 25% pulsation at the inlet of the cyclone as measured by this test can be acceptable for respirable particle collection. If this test is used in place of that currently in European standards (EN 1232-1997 and EN 12919-1999) or is used in any International Organization for Standardization standard, then a 25% pulsation criterion could be adopted. This work suggests that a 10% criterion as currently specified in the European standards for testing may be overly restrictive and not able to be met by many pumps on the market. Further work is recommended to determine which criterion would be applicable to this test if it is to be retained in its current form.

  5. Asymmetric flow field flow fractionation with light scattering detection - an orthogonal sensitivity analysis.

    PubMed

    Galyean, Anne A; Filliben, James J; Holbrook, R David; Vreeland, Wyatt N; Weinberg, Howard S

    2016-11-18

    Asymmetric flow field flow fractionation (AF4) has several instrumental factors that may have a direct effect on separation performance. A sensitivity analysis was applied to ascertain the relative importance of AF4 primary instrument factor settings for the separation of a complex environmental sample. The analysis evaluated the impact of instrumental factors, namely cross flow, ramp time, focus flow, injection volume, and run buffer concentration, on the multi-angle light scattering measurement of natural organic matter (NOM) molar mass (MM). A 2^(5-1) orthogonal fractional factorial design was used to minimize analysis time while preserving the accuracy and robustness in the determination of the main effects and interactions between any two instrumental factors. By assuming that separations resulting in smaller MM measurements would be more accurate, the analysis produced a ranked list of effects estimates for factors and interactions of factors based on their relative importance in minimizing the MM. The most important and statistically significant AF4 instrumental factors were buffer concentration and cross flow. The least important was ramp time. A parallel 2^(5-2) orthogonal fractional factorial design was also employed on five environmental factors for synthetic natural water samples containing silver nanoparticles (NPs), namely: NP concentration, NP size, NOM concentration, specific conductance, and pH. None of the water quality characteristic effects or interactions were found to be significant in minimizing the measured MM; however, the interaction between NP concentration and NP size was an important effect when considering NOM recovery. This work presents a structured approach for the rigorous assessment of AF4 instrument factors and optimal settings for the separation of complex samples utilizing an efficient orthogonal fractional factorial design and appropriate graphical analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
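
    A 2^(5-1) half-fraction such as the one used here can be built by crossing four base factors at two levels and generating the fifth from their product. The sketch below assumes the conventional resolution-V generator E = ABCD (the record does not state which generator was used) and labels the columns with the five instrument factors named in the abstract.

    ```python
    from itertools import product

    # 16-run half-fraction of a 2^5 design: enumerate four base factors at
    # levels -1/+1 and set the fifth from the generator E = ABCD.
    factors = ["cross_flow", "ramp_time", "focus_flow", "inj_volume", "buffer_conc"]
    runs = []
    for a, b, c, d in product((-1, 1), repeat=4):
        e = a * b * c * d          # generator: E = ABCD (assumed)
        runs.append((a, b, c, d, e))

    for run in runs:
        print(dict(zip(factors, run)))
    ```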

  6. Validation of abundance estimates from mark–recapture and removal techniques for rainbow trout captured by electrofishing in small streams

    USGS Publications Warehouse

    Rosenberger, Amanda E.; Dunham, Jason B.

    2005-01-01

    Estimation of fish abundance in streams using the removal model or the Lincoln-Petersen mark-recapture model is a common practice in fisheries. These models produce misleading results if their assumptions are violated. We evaluated the assumptions of these two models via electrofishing of rainbow trout Oncorhynchus mykiss in central Idaho streams. For one-, two-, three-, and four-pass sampling effort in closed sites, we evaluated the influences of fish size and habitat characteristics on sampling efficiency and the accuracy of removal abundance estimates. We also examined the use of models to generate unbiased estimates of fish abundance through adjustment of total catch or biased removal estimates. Our results suggested that the assumptions of the mark-recapture model were satisfied and that abundance estimates based on this approach were unbiased. In contrast, the removal model assumptions were not met. Decreasing sampling efficiencies over removal passes resulted in underestimated population sizes and overestimates of sampling efficiency. This bias decreased, but was not eliminated, with increased sampling effort. Biased removal estimates based on different levels of effort were highly correlated with each other but were less correlated with unbiased mark-recapture estimates. Stream size decreased sampling efficiency, and stream size and instream wood increased the negative bias of removal estimates. We found that reliable estimates of population abundance could be obtained from models of sampling efficiency for different levels of effort. Validation of abundance estimates requires extra attention to routine sampling considerations but can help fisheries biologists avoid pitfalls associated with biased data and facilitate standardized comparisons among studies that employ different sampling methods.
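
    For readers unfamiliar with the two models being compared, the sketch below shows their textbook two-pass forms: Chapman's bias-corrected version of the Lincoln-Petersen estimator and the Seber-LeCren two-pass removal estimator. The pass counts are hypothetical, not from the study.

    ```python
    def lincoln_petersen(m, c, r):
        # Chapman's bias-corrected Lincoln-Petersen estimate: m fish marked on
        # pass 1, c captured on pass 2, r recaptures among the c.
        return (m + 1) * (c + 1) / (r + 1) - 1

    def two_pass_removal(c1, c2):
        # Two-pass removal (Seber-LeCren) estimate; assumes equal capture
        # probability on both passes, the assumption the study found violated.
        if c1 <= c2:
            raise ValueError("removal estimator undefined when c1 <= c2")
        return c1 * c1 / (c1 - c2)

    # hypothetical pass counts for one closed stream site
    print(lincoln_petersen(m=60, c=55, r=30))   # ~109 fish
    print(two_pass_removal(c1=60, c2=25))       # ~103 fish
    ```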

  7. Efficiency reduction and pseudo-convergence in replica exchange sampling of peptide folding-unfolding equilibria

    NASA Astrophysics Data System (ADS)

    Denschlag, Robert; Lingenheil, Martin; Tavan, Paul

    2008-06-01

    Replica exchange (RE) molecular dynamics (MD) simulations are frequently applied to sample the folding-unfolding equilibria of β-hairpin peptides in solution, because efficiency gains are expected from this technique. Using a three-state Markov model featuring key aspects of β-hairpin folding, we show that RE simulations can be less efficient than conventional techniques. Furthermore, we demonstrate that one is easily seduced into erroneously assigning convergence to the RE sampling, because RE ensembles can rapidly reach long-lived stationary states. We conclude that typical REMD simulations covering a few tens of nanoseconds are by far too short for sufficient sampling of β-hairpin folding-unfolding equilibria.
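
    The three-state Markov picture can be reproduced in a few lines: with slow exchange between two long-lived states, a short trajectory settles into an apparently stationary occupancy that is still far from equilibrium. The transition probabilities below are invented for illustration, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical three-state model (folded F, intermediate I, unfolded U)
    # with slow F <-> U exchange through I; rows are one-step probabilities.
    P = np.array([[0.999, 0.001, 0.000],
                  [0.010, 0.980, 0.010],
                  [0.000, 0.001, 0.999]])

    # Exact stationary distribution: left eigenvector of P for eigenvalue 1.
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi /= pi.sum()

    # Short trajectory started in F: the occupancy looks "converged" long
    # before it approaches pi, i.e. pseudo-convergence.
    state, counts = 0, np.zeros(3)
    for _ in range(20000):
        state = rng.choice(3, p=P[state])
        counts[state] += 1
    print("empirical:", counts / counts.sum(), "stationary:", pi)
    ```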

  8. Mass absorption efficiency of elemental carbon over Van Vihar National Park, Bhopal, India: Temporal variability and implications to estimates of black carbon radiative forcing

    NASA Astrophysics Data System (ADS)

    Samiksha, S.; Raman, R. S.; Singh, A.

    2016-12-01

    It is now well recognized that black carbon (a component of aerosols that is similar but not identical to elemental carbon) is an important contributor to global warming, second only to CO2. However, the most popular methods for estimation of black carbon rely on accurate estimates of its mass absorption efficiency (MAE) to convert optical attenuation measurements to black carbon concentrations. Often a constant manufacturer-specified MAE is used for this purpose. Recent literature has unequivocally established that MAE shows large spatio-temporal heterogeneities. This is so because MAE depends on emission sources, chemical composition, and mixing state of aerosols. In this study, ambient PM2.5 samples were collected over an ecologically sensitive zone (Van Vihar National Park) in Bhopal, Central India for two years (01 January, 2012 to 31 December, 2013). Samples were collected on Teflon, Nylon, and Tissue quartz filter substrates. Punches of quartz fibre filter were analysed for organic and elemental carbon (OC/EC) by a thermal-optical-transmittance/reflectance (TOT/TOR) analyser operating with a 632 nm laser diode. Teflon filters were also used to independently measure PM2.5 attenuation (at 370 nm and 800 nm) by transmissometry. Site-specific mass absorption efficiency (MAE) for elemental carbon over the study site will be derived using a combination of measurements from the TOT/TOR analyser and transmissometer. An assessment of site-specific MAE values, their temporal variability, and implications for black carbon radiative forcing will be discussed.
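
    A minimal sketch of the site-specific MAE computation described above, assuming MAE is taken as the optical attenuation ln(I0/I) divided by the EC areal loading on the filter; the transmissometer readings, filter area and EC mass are hypothetical values, not the study's data.

    ```python
    import math

    def mass_absorption_efficiency(i0, i, filter_area_cm2, ec_mass_ug):
        # Site-specific MAE (m^2/g): attenuation ATN = ln(I0/I) through the
        # loaded filter divided by the EC areal loading from the OC/EC analysis.
        atn = math.log(i0 / i)
        loading_g_m2 = (ec_mass_ug * 1e-6) / (filter_area_cm2 * 1e-4)
        return atn / loading_g_m2

    # hypothetical values for one PM2.5 sample
    print(mass_absorption_efficiency(i0=1.00, i=0.72,
                                     filter_area_cm2=11.9, ec_mass_ug=40.0))
    # ~9.8 m^2/g, in the range typically reported for EC at red wavelengths
    ```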

  9. Steady-state low thermal resistance characterization apparatus: The bulk thermal tester

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burg, Brian R.; Kolly, Manuel; Blasakis, Nicolas

    The reliability of microelectronic devices is largely dependent on electronic packaging, which includes heat removal. The appropriate packaging design therefore necessitates precise knowledge of the relevant material properties, including thermal resistance and thermal conductivity. Thin materials and high-conductivity layers make their thermal characterization challenging. A steady-state measurement technique is presented and evaluated with the purpose of characterizing samples with a thermal resistance below 100 mm² K/W. It is based on the heat flow meter bar approach made up of two copper blocks and relies exclusively on temperature measurements from thermocouples. The importance of thermocouple calibration is emphasized in order to obtain accurate temperature readings. An in-depth error analysis, based on Gaussian error propagation, is carried out. An error sensitivity analysis highlights the importance of precise knowledge of the thermal interface materials required for the measurements. Reference measurements on Mo samples reveal a measurement uncertainty in the range of 5%, and the most accurate measurements are obtained at high heat fluxes. Measurement techniques for homogeneous bulk samples, layered materials, and protruding cavity samples are discussed. Ultimately, a comprehensive overview of a steady-state thermal characterization technique is provided, evaluating the accuracy of sample measurements with thermal resistances well below those of state-of-the-art setups. Accurate characterization of materials used in heat removal applications, such as electronic packaging, will enable more efficient designs and ultimately contribute to energy savings.

  10. Estimating rare events in biochemical systems using conditional sampling.

    PubMed

    Sundar, V S

    2017-01-28

    The paper focuses on development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining this probability using brute force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
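
    A generic sketch of subset simulation on a toy Gaussian problem (not the paper's biochemical setting, where samples come from Gillespie trajectories): the rare-event probability is accumulated as a product of level probabilities p0, with a component-wise ("modified") Metropolis chain repopulating each conditional level. All problem parameters are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def g(x):
        # Hypothetical scalar response; the rare event is {g(x) > b_target}.
        return x.sum()

    def subset_simulation(dim=10, n=1000, p0=0.1, b_target=12.0, step=1.0):
        # P[g > b_target] = P[g > b_1] * P[g > b_2 | g > b_1] * ... with the
        # intermediate thresholds b_k set as the (1 - p0) sample quantile.
        x = rng.normal(size=(n, dim))
        p = 1.0
        for _ in range(20):                         # cap on the number of levels
            y = np.array([g(xi) for xi in x])
            b = np.quantile(y, 1.0 - p0)
            if b >= b_target:                       # final level: estimate directly
                return p * np.mean(y > b_target)
            p *= p0
            seeds = x[y > b]
            samples = []
            # Repopulate {g > b} with component-wise Metropolis moves from the
            # seeds, the "modified Metropolis" step of subset simulation.
            for s in seeds:
                cur = s.copy()
                for _ in range(n // len(seeds)):
                    cand = cur.copy()
                    j = rng.integers(dim)
                    cand[j] += step * rng.normal()
                    # accept under the standard-normal prior, then keep the move
                    # only if it stays inside the current level set
                    if rng.random() < np.exp(0.5 * (cur[j] ** 2 - cand[j] ** 2)):
                        if g(cand) > b:
                            cur = cand
                    samples.append(cur.copy())
            x = np.array(samples)
        return p

    # exact answer: P[N(0,10) > 12] = 1 - Phi(12/sqrt(10)) ~ 7e-5
    print(subset_simulation())
    ```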

  11. Cationized Magnetoferritin Enables Rapid Labeling and Concentration of Gram-Positive and Gram-Negative Bacteria in Magnetic Cell Separation Columns

    PubMed Central

    Spencer, J.; Schwarzacher, W.

    2016-01-01

    In order to identify pathogens rapidly and reliably, bacterial capture and concentration from large sample volumes into smaller ones are often required. Magnetic labeling and capture of bacteria using a magnetic field hold great promise for achieving this goal, but the current protocols have poor capture efficiency. Here, we present a rapid and highly efficient approach to magnetic labeling and capture of both Gram-negative (Escherichia coli) and Gram-positive (Staphylococcus aureus) bacteria using cationized magnetoferritin (cat-MF). Magnetic labeling was achieved within a 1-min incubation period with cat-MF, and 99.97% of the labeled bacteria were immobilized in commercially available magnetic cell separation (MACS) columns. Longer incubation times led to more efficient capture, with S. aureus being immobilized to a greater extent than E. coli. Finally, low numbers of magnetically labeled E. coli bacteria (<100 CFU per ml) were immobilized with 100% efficiency and concentrated 7-fold within 15 min. Therefore, our study provides a novel protocol for rapid and highly efficient magnetic labeling, capture, and concentration of both Gram-positive and Gram-negative bacteria. IMPORTANCE Antimicrobial resistance (AMR) is a significant global challenge. Rapid identification of pathogens will retard the spread of AMR by enabling targeted treatment with suitable agents and by reducing inappropriate antimicrobial use. Rapid detection methods based on microfluidic devices require that bacteria are concentrated from large volumes into much smaller ones. Concentration of bacteria is also important to detect low numbers of pathogens with confidence. Here, we demonstrate that magnetic separation columns capture small amounts of bacteria with 100% efficiency. Rapid magnetization was achieved by exposing bacteria to cationic magnetic nanoparticles, and magnetized bacteria were concentrated 7-fold inside the column. Thus, bacterial capture and concentration were achieved within 15 min. This approach could be extended to encompass the capture and concentration of specific pathogens, for example, by functionalizing magnetic nanoparticles with antibodies or small molecule probes. PMID:27060124

  12. Shear-induced hydrodynamic cavitation as a tool for pharmaceutical micropollutants removal from urban wastewater.

    PubMed

    Zupanc, Mojca; Kosjek, Tina; Petkovšek, Martin; Dular, Matevž; Kompare, Boris; Širok, Brane; Stražar, Marjeta; Heath, Ester

    2014-05-01

    In this study, the removal of clofibric acid, ibuprofen, naproxen, ketoprofen, carbamazepine and diclofenac residues from wastewater using a novel shear-induced cavitation generator has been systematically studied. The effects of temperature, cavitation time and H2O2 dose on removal efficiency were investigated. Optimisation (50°C; 15 min; 340 mg L⁻¹ of added H2O2) resulted in removal efficiencies of 47-86% in spiked deionised water samples. Treatment of actual wastewater effluents revealed that although matrix composition reduces removal efficiency, this effect can be compensated for by increasing the H2O2 dose (3.4 g L⁻¹) and prolonging the cavitation time (30 min). Hydrodynamic cavitation has also been investigated as either a pre- or a post-treatment step to biological treatment. The results revealed a higher overall removal efficiency of recalcitrant diclofenac and carbamazepine when hydrodynamic cavitation was used before rather than after biological treatment, i.e., 54% and 67% as compared to 39% and 56%, respectively. This is an important finding, since diclofenac is considered a priority substance to be included in the EU Water Framework Directive. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. FARVATX: FAmily-based Rare Variant Association Test for X-linked genes

    PubMed Central

    Choi, Sungkyoung; Lee, Sungyoung; Qiao, Dandi; Hardin, Megan; Cho, Michael H.; Silverman, Edwin K; Park, Taesung; Won, Sungho

    2016-01-01

    Although the X chromosome has many genes that are functionally related to human diseases, the complicated biological properties of the X chromosome have prevented efficient genetic association analyses, and only a few significantly associated X-linked variants have been reported for complex traits. For instance, dosage compensation of X-linked genes is often achieved via the inactivation of one allele in each X-linked variant in females; however, some X-linked variants can escape this X chromosome inactivation. Efficient genetic analyses cannot be conducted without prior knowledge about the gene expression process of X-linked variants, and misspecified information can lead to power loss. In this report, we propose new statistical methods for rare X-linked variant genetic association analysis of dichotomous phenotypes with family-based samples. The proposed methods are computationally efficient and can complete X-linked analyses within a few hours. Simulation studies demonstrate the statistical efficiency of the proposed methods, which were then applied to rare-variant association analysis of the X chromosome in chronic obstructive pulmonary disease (COPD). Some promising significant X-linked genes were identified, illustrating the practical importance of the proposed methods. PMID:27325607

  14. FARVATX: Family-Based Rare Variant Association Test for X-Linked Genes.

    PubMed

    Choi, Sungkyoung; Lee, Sungyoung; Qiao, Dandi; Hardin, Megan; Cho, Michael H; Silverman, Edwin K; Park, Taesung; Won, Sungho

    2016-09-01

    Although the X chromosome has many genes that are functionally related to human diseases, the complicated biological properties of the X chromosome have prevented efficient genetic association analyses, and only a few significantly associated X-linked variants have been reported for complex traits. For instance, dosage compensation of X-linked genes is often achieved via the inactivation of one allele in each X-linked variant in females; however, some X-linked variants can escape this X chromosome inactivation. Efficient genetic analyses cannot be conducted without prior knowledge about the gene expression process of X-linked variants, and misspecified information can lead to power loss. In this report, we propose new statistical methods for rare X-linked variant genetic association analysis of dichotomous phenotypes with family-based samples. The proposed methods are computationally efficient and can complete X-linked analyses within a few hours. Simulation studies demonstrate the statistical efficiency of the proposed methods, which were then applied to rare-variant association analysis of the X chromosome in chronic obstructive pulmonary disease. Some promising significant X-linked genes were identified, illustrating the practical importance of the proposed methods. © 2016 WILEY PERIODICALS, INC.

  15. Development of a Solar Cell Back Sheet with Excellent UV Durability and Thermal Conductivity.

    PubMed

    Kang, Seong-Hwan; Choi, Jaeho; Lee, Sung-Ho; Song, Young-Hoon; Park, Jong-Se; Jung, In-Sung; Jung, Jin-Su; Kim, Chong-Yeal; Yang, O-Bong

    2018-09-01

    The back sheet is one of the most important materials in photovoltaic (PV) modules. It plays an important role in protecting the solar cell from the environment by preventing moisture penetration. In the back sheet, the outermost layer is composed of a polyester (PET) film to protect the PV module from moisture, and the opposite layer is composed of a TiO2 + PE material. PV modules are increasingly installed in desert environments. Therefore, methods to improve the power generation efficiency of PV modules need to be investigated, as the efficiency is affected by the temperature resulting from the heat radiation effect. Using a back sheet with a high thermal conductivity, the module output efficiency can be increased as heat is efficiently dissipated. In this study, a thermally conductive film was fabricated by mixing a reference film (TiO2 + PE) and a non-metallic material, MgO, with high thermal conductivity. UV irradiation tests of the film were conducted. The thermally conductive film (TiO2 + PE + MgO) showed higher conductivity than the reference film. No visible cracks and only a low degree of yellowing were found in the thermally conductive film, confirming its excellent UV durability characteristics. The sample film was bonded to a PET layer, and a back sheet was fabricated. The yellowing of the back sheet was also analyzed after UV irradiation. In addition, mini modules with four solar cells were fabricated using the developed back sheet, and a comparative outdoor test was conducted. The results showed that power generation improved by 1.38%.

  16. Microfluidics cell sample preparation for analysis: Advances in efficient cell enrichment and precise single cell capture

    PubMed Central

    Bian, Shengtai; Cheng, Yinuo; Shi, Guanya; Liu, Peng; Ye, Xiongying

    2017-01-01

    Single cell analysis has received increasing attention recently in both academia and clinics, and there is an urgent need for effective upstream cell sample preparation. Two extremely challenging tasks in cell sample preparation—high-efficiency cell enrichment and precise single cell capture—have now entered into an era full of exciting technological advances, which are mostly enabled by microfluidics. In this review, we summarize the categories of technologies that provide new solutions and creative insights into the two tasks of cell manipulation, with a focus on the latest development in the recent five years by highlighting the representative works. By doing so, we aim both to outline the framework and to showcase example applications of each task. In most cases for cell enrichment, we take circulating tumor cells (CTCs) as the target cells because of their research and clinical importance in cancer. For single cell capture, we review related technologies for many kinds of target cells because the technologies are supposed to be more universal to all cells rather than CTCs. Most of the mentioned technologies can be used for both cell enrichment and precise single cell capture. Each technology has its own advantages and specific challenges, which provide opportunities for researchers in their own area. Overall, these technologies have shown great promise and now evolve into real clinical applications. PMID:28217240

  17. Occurrence and removal of phenolic endocrine disrupting chemicals in the water treatment processes

    NASA Astrophysics Data System (ADS)

    Lv, Xuemin; Xiao, Sanhua; Zhang, Gang; Jiang, Pu; Tang, Fei

    2016-03-01

    This paper evaluated the occurrence and removal efficiency of four selected phenolic endocrine disrupting chemicals (bisphenol A (BPA), octylphenol (OP), nonylphenol (NP) and diethylstilbestrol (DES)) in two drinking waterworks in Jiangsu province which take source water from Taihu Lake. The recombined yeast estrogen screen (YES) and liquid chromatography tandem mass spectrometry (LC-MS/MS) were applied to assess the estrogenicity and detect the estrogens in the samples. The estrogen equivalents (EEQs) ranged from nd (not detected) to 2.96 ng/L, and the estrogenic activities decreased along the processes. Among the 32 samples, DES prevailed in all samples, with concentrations ranging from 1.46 to 12.0 ng/L; BPA, OP and NP were partially detected, with concentrations ranging from nd to 17.73 ng/L, nd to 0.49 ng/L and nd to 3.27 ng/L, respectively. DES was found to be the main contributor to the estrogenicity (99.06%), followed by NP (0.62%), OP (0.23%) and BPA (0.09%). From the observation of treatment efficiency, the advanced treatment processes presented a much higher removal ratio for DES, biodegradation played an important role in removing BPA, and ozonation and pre-oxidation showed effective removal of all four estrogens, while the conventional processes also reduced all four estrogens.

  18. The efficiency of concentration methods used to detect enteric viruses in anaerobically digested sludge

    PubMed Central

    Prado, Tatiana; Guilayn, Wilma de Carvalho Pereira Bonet; Gaspar, Ana Maria Coimbra; Miagostovich, Marize Pereira

    2013-01-01

    The presence of enteric viruses in biosolids can be underestimated due to the inefficient methods (mainly molecular methods) used to recover the viruses from these matrices. Therefore, the goal of this study was to evaluate the different methods used to recover adenoviruses (AdV), rotavirus species A (RVA), norovirus genogroup II (NoV GII) and the hepatitis A virus (HAV) from biosolid samples at a large urban wastewater treatment plant in Brazil after they had been treated by mesophilic anaerobic digestion. Quantitative polymerase chain reaction (PCR) was used for spiking experiments to compare the detection limits of feasible methods, such as beef extract elution and ultracentrifugation. Tests were performed to detect the inhibition levels and the bacteriophage PP7 was used as an internal control. The results showed that the inhibitors affected the efficiency of the PCR reaction and that beef extract elution is a suitable method for detecting enteric viruses, mainly AdV from biosolid samples. All of the viral groups were detected in the biosolid samples: AdV (90%), RVA, NoV GII (45%) and HAV (18%), indicating the viruses' resistance to the anaerobic treatment process. This is the first study in Brazil to detect the presence of RVA, AdV, NoV GII and HAV in anaerobically digested sludge, highlighting the importance of adequate waste management. PMID:23440119

  19. Picoelectrospray Ionization Mass Spectrometry Using Narrow-bore Chemically Etched Emitters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marginean, Ioan; Tang, Keqi; Smith, Richard D.

    2014-01-01

    Electrospray ionization mass spectrometry (ESI-MS) at flow rates below ~10 nL/min has been only sporadically explored due to the difficulty of reproducibly fabricating emitters that can operate at lower flow rates. Here we demonstrate narrow-orifice chemically etched emitters for stable electrospray at flow rates as low as 400 pL/min. Depending on the analyte concentration, we observe two types of MS signal response as a function of flow rate. At low concentrations, an optimum flow rate is observed slightly above 1 nL/min, while the signal decreases monotonically with decreasing flow rates at higher concentrations. In spite of lower MS signal, the ion utilization efficiency increases exponentially with decreasing flow rate in all cases. No uniform response was observed within this flow rate range during the analysis of an equimolar mixture of peptides, indicating that ionization efficiency is an analyte-dependent characteristic under the given experimental conditions. While little to no gain in signal-to-noise was achieved at ultralow flow rates for concentration-limited analyses, experiments consuming the same amount of analyte suggest that mass-limited analyses will benefit strongly from the use of low flow rates and the avoidance of unnecessary sample dilution. By operating under optimal conditions, consumption of just 500 zmol of sample yielded signal-to-noise ratios of ~10 for some peptides. These findings have important implications for the analysis of trace biological samples.

  20. Occurrence and removal of phenolic endocrine disrupting chemicals in the water treatment processes

    PubMed Central

    Lv, Xuemin; Xiao, Sanhua; Zhang, Gang; Jiang, Pu; Tang, Fei

    2016-01-01

    This paper evaluated the occurrence and removal efficiency of four selected phenolic endocrine disrupting chemicals (bisphenol A (BPA), octylphenol (OP), nonylphenol (NP) and diethylstilbestrol (DES)) in two drinking waterworks in Jiangsu province which take source water from Taihu Lake. The recombined yeast estrogen screen (YES) and liquid chromatography tandem mass spectrometry (LC-MS/MS) were applied to assess the estrogenicity and detect the estrogens in the samples. The estrogen equivalents (EEQs) ranged from nd (not detected) to 2.96 ng/L, and the estrogenic activities decreased along the processes. Among the 32 samples, DES prevailed in all samples, with concentrations ranging from 1.46 to 12.0 ng/L; BPA, OP and NP were partially detected, with concentrations ranging from nd to 17.73 ng/L, nd to 0.49 ng/L and nd to 3.27 ng/L, respectively. DES was found to be the main contributor to the estrogenicity (99.06%), followed by NP (0.62%), OP (0.23%) and BPA (0.09%). From the observation of treatment efficiency, the advanced treatment processes presented a much higher removal ratio for DES, biodegradation played an important role in removing BPA, and ozonation and pre-oxidation showed effective removal of all four estrogens, while the conventional processes also reduced all four estrogens. PMID:26953121

  1. Influence of extrinsic operational parameters on salt diffusion during ultrasound assisted meat curing.

    PubMed

    Inguglia, Elena S; Zhang, Zhihang; Burgess, Catherine; Kerry, Joseph P; Tiwari, Brijesh K

    2018-02-01

    The present study investigated the effect of geometric parameters of the ultrasound instrument during meat salting in order to enhance salt diffusion and salt distribution in pork meat on a lab scale. The effects of probe size (2.5 and 1.3 cm diameter) and of different distances between the transducer and the meat sample (0.3, 0.5, and 0.8 cm) on NaCl diffusion were investigated. Changes in the moisture content and NaCl gain were used to evaluate salt distribution and diffusion in the samples, parallel and perpendicular to the ultrasound propagation direction. Results showed that 0.3 cm was the most efficient distance between the probe and the sample to ensure a higher salt diffusion rate. A distance of 0.5 cm was, however, considered a trade-off distance to ensure salt diffusion while maintaining meat quality parameters. The enhancement of salt diffusion by ultrasound was observed to decrease with increased horizontal distance from the probe. This study is of particular value for meat processing industries aiming to apply new technologies on a larger scale and with defined operational standards. The data suggest that the geometric parameters of ultrasound systems can have a strong influence on the efficiency of ultrasonic enhancement of NaCl uptake in meat and can be a crucial element in determining salt uptake during meat processing. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Geographic Clustering of Cardiometabolic Risk Factors in Metropolitan Centres in France and Australia

    PubMed Central

    Paquet, Catherine; Chaix, Basile; Howard, Natasha J.; Coffee, Neil T.; Adams, Robert J.; Taylor, Anne W.; Thomas, Frédérique; Daniel, Mark

    2016-01-01

    Understanding how health outcomes are spatially distributed represents a first step in investigating the scale and nature of environmental influences on health and has important implications for statistical power and analytic efficiency. Using Australian and French cohort data, this study aimed to describe and compare the extent of geographic variation, and the implications for analytic efficiency, across geographic units, countries and a range of cardiometabolic parameters (Body Mass Index (BMI), waist circumference, blood pressure, resting heart rate, triglycerides, cholesterol, glucose, HbA1c). Geographic clustering was assessed using Intra-Class Correlation (ICC) coefficients in biomedical cohorts from Adelaide (Australia, n = 3893) and Paris (France, n = 6430) for eight geographic administrative units. The median ICC was 0.01 suggesting 1% of risk factor variance attributable to variation between geographic units. Clustering differed by cardiometabolic parameters, administrative units and countries and was greatest for BMI and resting heart rate in the French sample, HbA1c in the Australian sample, and for smaller geographic units. Analytic inefficiency due to clustering was greatest for geographic units in which participants were nested in fewer, larger geographic units. Differences observed in geographic clustering across risk factors have implications for choice of geographic unit in sampling and analysis, and highlight potential cross-country differences in the distribution, or role, of environmental features related to cardiometabolic health. PMID:27213423
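
    The clustering metric used here, the one-way ANOVA intra-class correlation, and the resulting design effect 1 + (m - 1)·ICC can be computed as in the sketch below; the area-level values are invented for illustration.

    ```python
    import numpy as np

    def icc_oneway(groups):
        # One-way ANOVA estimator: ICC = (MSB - MSW) / (MSB + (m0 - 1) * MSW),
        # where m0 is the size-adjusted average cluster size.
        k = len(groups)
        n = sum(len(g) for g in groups)
        grand = np.concatenate(groups).mean()
        msb = sum(len(g) * (np.mean(g) - grand) ** 2 for g in groups) / (k - 1)
        msw = sum(((np.asarray(g) - np.mean(g)) ** 2).sum() for g in groups) / (n - k)
        m0 = (n - sum(len(g) ** 2 for g in groups) / n) / (k - 1)
        return (msb - msw) / (msb + (m0 - 1) * msw)

    # hypothetical BMI values clustered in three small areas
    areas = [np.array([24.1, 25.3, 26.0]),
             np.array([27.2, 28.1, 27.5]),
             np.array([25.0, 24.4, 25.9])]
    icc = icc_oneway(areas)
    deff = 1 + (3 - 1) * icc       # design effect for clusters of size m = 3
    print(round(icc, 3), round(deff, 3))
    ```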

  3. Development and Biocompatibility Evaluation of Photocatalytic TiO2/Reduced Graphene Oxide-Based Nanoparticles Designed for Self-Cleaning Purposes

    PubMed Central

    Nica, Ionela Cristina; Stan, Miruna S.; Popa, Marcela; Chifiriuc, Mariana Carmen; Pircalabioru, Gratiela G.; Lazar, Veronica; Dumitrescu, Iuliana; Diamandescu, Lucian; Feder, Marcel; Baibarac, Mihaela; Cernea, Marin; Popescu, Traian; Dinischiotu, Anca

    2017-01-01

    Graphene is widely used in nanotechnologies to amplify the photocatalytic activity of TiO2, but the development of TiO2/graphene composites necessitates the assessment of their risk to human and environmental health. Therefore, reduced graphene oxide was decorated with two types of TiO2 particles co-doped with 1% iron and nitrogen, one of them being obtained by a simultaneous precipitation of Ti3+ and Fe3+ ions to achieve their uniform distribution, and the other one after a sequential precipitation of these two cations for a higher concentration of iron on the surface. Physico-chemical characterization, photocatalytic efficiency evaluation, antimicrobial analysis and biocompatibility assessment were performed for these TiO2-based composites. The best photocatalytic efficiency was found for the sample with iron atoms localized at the sample surface. A very good inhibitory activity was obtained for both samples against biofilms of Gram-positive and Gram-negative strains. Exposure of human skin and lung fibroblasts to photocatalysts did not significantly affect cell viability, but analysis of oxidative stress showed increased levels of carbonyl groups and advanced oxidation protein products for both cell lines after 48 h of incubation. Our findings are of major importance by providing useful knowledge for future photocatalytic self-cleaning and biomedical applications of graphene-based materials. PMID:28925946

  4. Quadratic partial eigenvalue assignment in large-scale stochastic dynamic systems for resilient and economic design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Das, Sonjoy; Goswami, Kundan; Datta, Biswa N.

    2014-12-10

    Failure of structural systems under dynamic loading can be prevented via active vibration control, which shifts the damped natural frequencies of the systems away from the dominant range of the loading spectrum. The damped natural frequencies and the dynamic load typically show significant variations in practice. A computationally efficient methodology based on the quadratic partial eigenvalue assignment technique and optimization under uncertainty has been formulated in the present work that will rigorously account for these variations and result in an economic and resilient design of structures. A novel scheme based on hierarchical clustering and importance sampling is also developed in this work for accurate and efficient estimation of the probability of failure to guarantee the desired resilience level of the designed system. Numerical examples are presented to illustrate the proposed methodology.
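
    The record combines clustering with importance sampling to estimate small failure probabilities. The reweighting core of such an estimator, on a toy two-variable limit state with an assumed design point, looks like the sketch below; the paper's hierarchical clustering step is omitted.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)

    def failure(x):
        # Hypothetical limit state: failure when the combined response exceeds 7.
        return x[:, 0] + x[:, 1] > 7.0

    # Crude Monte Carlo would need ~10^8 samples here; instead sample from a
    # proposal shifted toward the failure region and reweight each sample by
    # the ratio of target to proposal densities.
    n = 20000
    mu_shift = np.array([3.5, 3.5])     # assumed design point (illustrative)
    x = rng.normal(size=(n, 2)) + mu_shift
    w = norm.pdf(x).prod(axis=1) / norm.pdf(x - mu_shift).prod(axis=1)
    p_f = np.mean(failure(x) * w)
    print(p_f)   # ~ P[N(0,2) > 7] = 1 - Phi(7/sqrt(2)) ~ 3.7e-7
    ```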

  5. Activity and growth efficiency of heterotrophic bacteria in a salt marsh (Ria de Aveiro, Portugal).

    PubMed

    Cunha, M A; Pedro, R; Almeida, M A; Silva, M H

    2005-01-01

    Bacterial utilization of monomers is recognized as an important step in the biogeochemical cycling of organic matter. In this study we have compared the heterotrophic activity of bacterial communities from different micro-habitats within a salt marsh environment (Ria de Aveiro, Portugal) in order to establish spatial patterns of bacterial abundance, monomer turnover rates (Tr) and bacterial growth efficiency (BGE). Differences in bacterial abundance and activity could be found between distinct plant rhizospheres. BGE tended to be lower at Halimione portulacoides banks, when compared to Sarcocornia perennis subsp. perennis banks which, on the contrary, showed the highest bacterial densities. Experiments in which natural samples were amended with organic and inorganic supplements indicated that salt marsh bacteria are not strongly regulated by salinity, but that increased availability of labile organic matter causes a significant metabolic shift towards mineralization.

  6. More ethical and more efficient clinical research: multiplex trial design.

    PubMed

    Keus, Frederik; van der Horst, Iwan C C; Nijsten, Maarten W

    2014-08-14

    Today's clinical research faces challenges such as a lack of clinical equipoise between treatment arms, reluctance in randomizing for multiple treatments simultaneously, inability to address interactions and increasingly restricted resources. Furthermore, many trials are biased by extensive exclusion criteria, relatively small sample sizes and less appropriate outcome measures. We propose a 'Multiplex' trial design that preserves clinical equipoise with a continuous and factorial trial design that will also result in more efficient use of resources. This multiplex design accommodates subtrials with an appropriate choice of treatment arms within each subtrial. Clinical equipoise should increase consent rates, while the factorial design is the best way to identify interactions. The multiplex design may evolve naturally from today's research limitations and challenges, while principal objections seem absent. However, this new design poses important infrastructural, organisational and psychological challenges that need in-depth consideration.

  7. Thermally activated persistent photoconductivity & donor binding energy in high mobility AlAs QWs

    NASA Astrophysics Data System (ADS)

    Dasgupta, S.; Knaak, C.; Fontcuberta, A.; Bichler, M.; Abstreiter, G.; Grayson, M.

    2008-03-01

    In AlAs, the valley index is an important quantum number that can help in understanding interactions. However, important growth parameters such as the donor binding energy and the Si δ-doping efficiency were unknown, and AlAs quantum wells (QWs) typically did not conduct in the dark. We grew a series of (001)- and (110)-oriented, double-sided doped n-type AlAs QWs and deduced the Si donor binding energy δ in Al0.45Ga0.55As and the doping efficiency η. These wells conduct in the dark, possibly because dilute charge traps in the substrate are screened by the backside doping. From the dark saturation density of the doping series we deduced δdk = 65.2 meV [1]. Our studies show thermally activated persistent photoconductivity (PPC), where the sample is illuminated at 4 K and returned to dark without an appreciable density increase. As the temperature is increased to 30 K, the density doubles, indicating a shallow binding energy δPIA = 0 meV after a post-illumination anneal (PIA). The doping efficiency after illumination was found to be η = n2D/nSi = 35% for the (001) facet and η = 17% for (110). With this understanding, we designed a (001) AlAs QW with a PIA density n = 2.4 x 10^11 cm^-2 and mobility μ = 4.3 x 10^5 cm^2/V s (330 mK), an improvement of almost an order of magnitude over published results. [1] Dasgupta, et al. APL (2007)

  8. Design of telehealth trials--introducing adaptive approaches.

    PubMed

    Law, Lisa M; Wason, James M S

    2014-12-01

    The field of telehealth and telemedicine is expanding as the need to improve the efficiency of health care becomes more pressing. The decision to implement a telehealth system is generally an expensive undertaking that impacts a large number of patients and other stakeholders. It is therefore extremely important that the decision is fully supported by accurate evaluation of telehealth interventions. Numerous reviews of telehealth have described the evidence base as inconsistent. In response they call for larger, more rigorously controlled trials, and trials which go beyond evaluation of clinical effectiveness alone. The aim of this paper is to discuss various ways in which evaluation of telehealth could be improved by the use of adaptive trial designs. We discuss various adaptive design options, such as sample size reviews and changing the study hypothesis to address uncertain parameters, group sequential trials and multi-arm multi-stage trials to improve efficiency, and enrichment designs to maximise the chances of obtaining clear evidence about the telehealth intervention. There is potential to address the flaws discussed in the telehealth literature through the adoption of adaptive approaches to trial design. Such designs could lead to improvements in efficiency, allow the evaluation of multiple telehealth interventions in a cost-effective way, or accurately assess a range of endpoints that are important in the overall success of a telehealth programme. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  9. Efficient Actor-Critic Algorithm with Hierarchical Model Learning and Planning

    PubMed Central

    Fu, QiMing

    2016-01-01

    To improve the convergence rate and the sample efficiency, two efficient learning methods AC-HMLP and RAC-HMLP (AC-HMLP with ℓ2-regularization) are proposed by combining actor-critic algorithm with hierarchical model learning and planning. The hierarchical models consisting of the local and the global models, which are learned at the same time during learning of the value function and the policy, are approximated by local linear regression (LLR) and linear function approximation (LFA), respectively. Both the local model and the global model are applied to generate samples for planning; the former is used only if the state-prediction error does not surpass the threshold at each time step, while the latter is utilized at the end of each episode. The purpose of taking both models is to improve the sample efficiency and accelerate the convergence rate of the whole algorithm through fully utilizing the local and global information. Experimentally, AC-HMLP and RAC-HMLP are compared with three representative algorithms on two Reinforcement Learning (RL) benchmark problems. The results demonstrate that they perform best in terms of convergence rate and sample efficiency. PMID:27795704

  10. Efficient Actor-Critic Algorithm with Hierarchical Model Learning and Planning.

    PubMed

    Zhong, Shan; Liu, Quan; Fu, QiMing

    2016-01-01

    To improve the convergence rate and the sample efficiency, two efficient learning methods AC-HMLP and RAC-HMLP (AC-HMLP with ℓ2-regularization) are proposed by combining actor-critic algorithm with hierarchical model learning and planning. The hierarchical models consisting of the local and the global models, which are learned at the same time during learning of the value function and the policy, are approximated by local linear regression (LLR) and linear function approximation (LFA), respectively. Both the local model and the global model are applied to generate samples for planning; the former is used only if the state-prediction error does not surpass the threshold at each time step, while the latter is utilized at the end of each episode. The purpose of taking both models is to improve the sample efficiency and accelerate the convergence rate of the whole algorithm through fully utilizing the local and global information. Experimentally, AC-HMLP and RAC-HMLP are compared with three representative algorithms on two Reinforcement Learning (RL) benchmark problems. The results demonstrate that they perform best in terms of convergence rate and sample efficiency.
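
    AC-HMLP builds on a standard actor-critic core; a minimal version of that core with linear function approximation (without the hierarchical model learning and planning the paper adds on top) is sketched below on a toy chain task. The task and step sizes are illustrative choices.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy chain: states 0..9, actions move left/right, reward 1 at the right end.
    N_STATES, ACTIONS = 10, (-1, +1)

    def features(s):
        phi = np.zeros(N_STATES)
        phi[s] = 1.0          # one-hot features: tabular special case of LFA
        return phi

    v_w = np.zeros(N_STATES)                 # critic: linear value function
    pi_w = np.zeros((N_STATES, 2))           # actor: softmax action preferences
    alpha_v, alpha_pi, gamma = 0.1, 0.05, 0.95

    for episode in range(500):
        s = 0
        while s != N_STATES - 1:             # v at the terminal state stays 0
            probs = np.exp(pi_w[s]); probs /= probs.sum()
            a = rng.choice(2, p=probs)
            s2 = min(max(s + ACTIONS[a], 0), N_STATES - 1)
            r = 1.0 if s2 == N_STATES - 1 else 0.0
            delta = r + gamma * v_w @ features(s2) - v_w @ features(s)  # TD error
            v_w += alpha_v * delta * features(s)                        # critic step
            grad = -probs; grad[a] += 1.0                               # softmax log-gradient
            pi_w[s] += alpha_pi * delta * grad                          # actor step
            s = s2
    print(np.round(v_w, 2))
    ```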

  11. Usage of a selective media (EMJH-STAFF) in primary culturing of pathogenic leptospires from bovine clinical samples.

    PubMed

    Loureiro, A P; Martins, G; Pinto, P; Narduche, L; Teixeira, R C; Lilenbaum, W

    2015-12-01

    Isolation of local strains is mandatory for the success of control programs. However, clinical samples are typically contaminated by other bacteria, which impair leptospiral growth. The purpose of this study was to evaluate the use of a previously reported EMJH-STAFF medium in the recovery of pathogenic leptospires from bovine clinical samples, namely urine (n = 123) and vaginal fluid-VF (n = 102). EMJH-STAFF presented less contamination than EMJH (P < 0·005), which was more evident in VF culture tubes. Nine pure leptospire cultures were obtained, six from urine (4·9%) and three from VF (2·9%). From those, seven grew on EMJH-STAFF, one on EMJH and one in both media. All the isolates were confirmed as pathogenic leptospires by lipL32-PCR, and sequencing of partial rrs showed them to belong to the Leptospira noguchii, Leptospira santarosai and Leptospira interrogans species. The EMJH-STAFF medium was an important tool in the recovery of leptospires from bovine clinical samples. The slow growth of leptospires and the overgrowth of co-existing micro-organisms from the environment and the microbiota are the major difficulties in recovering Leptospira from animal clinical samples. Determining the circulating leptospires in a region and their reservoirs is essential for implementing an efficient control programme. This study evaluated the effect of a selective medium (EMJH-STAFF) on the recovery of pathogenic leptospires (Leptospira noguchii, Leptospira santarosai and Leptospira interrogans) from bovine clinical samples (urine and vaginal fluid). EMJH-STAFF seems to be an important tool in obtaining local strains for epidemiological and control purposes. © 2015 The Society for Applied Microbiology.

  12. Generating probabilistic Boolean networks from a prescribed transition probability matrix.

    PubMed

    Ching, W-K; Chen, X; Tsing, N-K

    2009-11-01

    Probabilistic Boolean networks (PBNs) have received much attention in modeling genetic regulatory networks. A PBN can be regarded as a Markov chain process and is characterised by a transition probability matrix. In this study, the authors propose efficient algorithms for constructing a PBN when its transition probability matrix is given. The complexities of the algorithms are also analysed. This is an interesting inverse problem in network inference using steady-state data. The problem is important as most microarray data sets are assumed to be obtained from sampling the steady-state.
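
    One way to see the inverse problem is to peel the given transition probability matrix into a convex combination of deterministic matrices (one 1 per row), each of which would correspond to a constituent network with a selection probability. The greedy sketch below illustrates only this decomposition idea; it does not enforce that each peeled matrix is realizable by Boolean predictor functions, which is the harder part that the paper's algorithms address.

    ```python
    import numpy as np

    def decompose(P, tol=1e-9):
        # Greedily write P ~ sum_k c_k A_k, where each A_k has exactly one 1
        # per row (a deterministic state map) and c_k is its weight. At each
        # step take, per row, the column with the largest remaining mass and
        # peel off the largest feasible coefficient.
        P = P.astype(float).copy()
        maps, coeffs = [], []
        while P.max() > tol:
            cols = P.argmax(axis=1)            # one successor state per row
            c = P[np.arange(P.shape[0]), cols].min()
            if c <= tol:
                break                          # remaining mass cannot be peeled
            A = np.zeros_like(P)
            A[np.arange(P.shape[0]), cols] = 1.0
            maps.append(cols); coeffs.append(c)
            P -= c * A
        return maps, coeffs

    # transition matrix over the 4 states (00, 01, 10, 11) of a 2-gene network
    P = np.array([[0.7, 0.3, 0.0, 0.0],
                  [0.0, 0.6, 0.4, 0.0],
                  [0.2, 0.0, 0.8, 0.0],
                  [0.0, 0.0, 0.5, 0.5]])
    print(decompose(P))
    ```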

  13. Space resection model calculation based on Random Sample Consensus algorithm

    NASA Astrophysics Data System (ADS)

    Liu, Xinzhu; Kang, Zhizhong

    2016-03-01

    Resection has long been one of the most important topics in photogrammetry. It aims to recover the position and attitude of the camera at the moment of exposure. In some cases, however, the observations used in the calculation contain gross errors. This paper presents a robust algorithm which, by using the RANSAC method with a direct linear transformation (DLT) model, effectively avoids the difficulty of determining initial values for the collinearity equations. The results also show that our strategy can exclude gross errors and leads to an accurate and efficient way of obtaining the elements of exterior orientation.
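
    The consensus idea is independent of the camera model. The sketch below shows a generic RANSAC loop, with a robust line fit standing in for the DLT resection model (in the paper's setting, `fit` would estimate the DLT projection parameters from at least six image/object point pairs); the data are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def ransac(data, fit, residuals, n_min, n_iter=500, thresh=1.0):
        # Fit the model to random minimal subsets, score each candidate by its
        # inlier count, and refit on the best consensus set.
        best_inliers = None
        for _ in range(n_iter):
            subset = data[rng.choice(len(data), n_min, replace=False)]
            model = fit(subset)
            inl = np.abs(residuals(model, data)) < thresh
            if best_inliers is None or inl.sum() > best_inliers.sum():
                best_inliers = inl
        return fit(data[best_inliers]), best_inliers

    # toy instance: robust line fit y = a*x + b with ~30% gross errors
    x = np.linspace(0, 10, 50)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.1, 50)
    y[::3] += rng.normal(0, 20, len(y[::3]))        # inject outliers
    data = np.column_stack([x, y])
    fit = lambda d: np.polyfit(d[:, 0], d[:, 1], 1)
    res = lambda m, d: d[:, 1] - np.polyval(m, d[:, 0])
    model, inliers = ransac(data, fit, res, n_min=2)
    print(model)   # close to [2.0, 1.0]
    ```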

  14. GARLIC: GAmma Reconstruction at a LInear Collider experiment

    NASA Astrophysics Data System (ADS)

    Jeans, D.; Brient, J.-C.; Reinhard, M.

    2012-06-01

    The precise measurement of hadronic jet energy is crucial to maximise the physics reach of a future Linear Collider. An important ingredient required to achieve this is the efficient identification of photons within hadronic showers. One configuration of the ILD detector concept employs a highly granular silicon-tungsten sampling calorimeter to identify and measure photons, and the GARLIC algorithm described in this paper has been developed to identify photons in such a calorimeter. We describe the algorithm and characterise its performance using events fully simulated in a model of the ILD detector.

  15. Data Mining Methods for Recommender Systems

    NASA Astrophysics Data System (ADS)

    Amatriain, Xavier; Jaimes, Alejandro; Oliver, Nuria; Pujol, Josep M.

    In this chapter, we give an overview of the main Data Mining techniques used in the context of Recommender Systems. We first describe common preprocessing methods such as sampling or dimensionality reduction. Next, we review the most important classification techniques, including Bayesian Networks and Support Vector Machines. We describe the k-means clustering algorithm and discuss several alternatives. We also present association rules and related algorithms for an efficient training process. In addition to introducing these techniques, we survey their uses in Recommender Systems and present cases where they have been successfully applied.

  16. [On the importance of the steam trap to the efficient sterilization of solutions in stored blood bottles by saturated steam under pressure (author's transl)].

    PubMed

    Schreiber, M; Göbel, M

    1979-01-01

    Biological tests with soil samples were performed to fix the sterilization time for a new steam sterilizer. These tests yielded repeatedly positive spore findings despite modifications of the conditions of sterilization. Having excluded a series of possible sources of trouble, the authors stated that the quality of the steam was the assignable cause. After restoration of the functionality of the steam traps, the biological tests yielded negative results also under normal conditions of sterilization.

  17. Semi-automatic engineering and tailoring of high-efficiency Bragg-reflection waveguide samples for quantum photonic applications

    NASA Astrophysics Data System (ADS)

    Pressl, B.; Laiho, K.; Chen, H.; Günthner, T.; Schlager, A.; Auchter, S.; Suchomel, H.; Kamp, M.; Höfling, S.; Schneider, C.; Weihs, G.

    2018-04-01

    Semiconductor alloys of aluminum gallium arsenide (AlGaAs) exhibit strong second-order optical nonlinearities. This makes them prime candidates for the integration of devices for classical nonlinear optical frequency conversion or photon-pair production, for example, through the parametric down-conversion (PDC) process. Within this material system, Bragg-reflection waveguides (BRW) are a promising platform, but the specifics of the fabrication process and the peculiar optical properties of the alloys require careful engineering. Previously, BRW samples have been mostly derived analytically from design equations using a fixed set of aluminum concentrations. This approach limits the variety and flexibility of the device design. Here, we present a comprehensive guide to the design and analysis of advanced BRW samples and show how to automatize these tasks. Then, nonlinear optimization techniques are employed to tailor the BRW epitaxial structure towards a specific design goal. As a demonstration of our approach, we search for the optimal effective nonlinearity and mode overlap which indicate an improved conversion efficiency or PDC pair production rate. However, the methodology itself is much more versatile as any parameter related to the optical properties of the waveguide, for example the phasematching wavelength or modal dispersion, may be incorporated as design goals. Further, we use the developed tools to gain a reliable insight in the fabrication tolerances and challenges of real-world sample imperfections. One such example is the common thickness gradient along the wafer, which strongly influences the photon-pair rate and spectral properties of the PDC process. Detailed models and a better understanding of the optical properties of a realistic BRW structure are not only useful for investigating current samples, but also provide important feedback for the design and fabrication of potential future turn-key devices.

  18. Finite element model updating using the shadow hybrid Monte Carlo technique

    NASA Astrophysics Data System (ADS)

    Boulkaibet, I.; Mthembu, L.; Marwala, T.; Friswell, M. I.; Adhikari, S.

    2015-02-01

    Recent research in the field of finite element model (FEM) updating advocates the adoption of Bayesian analysis techniques to deal with the uncertainties associated with these models. However, Bayesian formulations require the evaluation of the posterior distribution function, which may not be available in analytical form. This is the case in FEM updating. In such cases sampling methods can provide good approximations of the posterior distribution when implemented in the Bayesian context. Markov Chain Monte Carlo (MCMC) algorithms are the most popular sampling tools used to sample probability distributions. However, the efficiency of these algorithms is affected by the complexity of the systems (the size of the parameter space). The Hybrid Monte Carlo (HMC) method offers a very important MCMC approach to dealing with higher-dimensional complex problems. The HMC uses molecular dynamics (MD) steps as the global Monte Carlo (MC) moves to reach areas of high probability, where the gradient of the log-density of the posterior acts as a guide during the search process. However, the acceptance rate of HMC is sensitive to the system size as well as to the time step used to evaluate the MD trajectory. To overcome this limitation we propose the use of the Shadow Hybrid Monte Carlo (SHMC) algorithm. The SHMC algorithm is a modified version of the Hybrid Monte Carlo (HMC) algorithm, designed to improve sampling for large system sizes and time steps. This is done by sampling from a modified Hamiltonian function instead of the normal Hamiltonian function. In this paper, the efficiency and accuracy of the SHMC method are tested on the updating of two real structures, an unsymmetrical H-shaped beam structure and a GARTEUR SM-AG19 structure, and compared to the application of the HMC algorithm on the same structures.
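
    One HMC move, as used inside such Bayesian updating, consists of a momentum refresh, a leapfrog integration of Hamilton's equations, and a Metropolis test on the total Hamiltonian; SHMC changes only the accept/reject target (a shadow Hamiltonian that tracks the leapfrog trajectory more closely for large systems and time steps). The toy Gaussian posterior below is a stand-in for the structural-parameter posterior.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def hmc_step(q, grad_log_post, log_post, eps=0.1, n_leap=20):
        # Momentum refresh, leapfrog integration, Metropolis accept/reject.
        p = rng.normal(size=q.shape)
        q_new, p_new = q.copy(), p.copy()
        p_new += 0.5 * eps * grad_log_post(q_new)      # half momentum step
        for _ in range(n_leap - 1):
            q_new += eps * p_new                       # full position step
            p_new += eps * grad_log_post(q_new)        # full momentum step
        q_new += eps * p_new
        p_new += 0.5 * eps * grad_log_post(q_new)      # final half step
        dH = (-log_post(q_new) + 0.5 * p_new @ p_new) \
           - (-log_post(q) + 0.5 * p @ p)
        return q_new if rng.random() < np.exp(-dH) else q

    # toy posterior: standard bivariate Gaussian
    log_post = lambda q: -0.5 * q @ q
    grad = lambda q: -q
    q, samples = np.zeros(2), []
    for _ in range(2000):
        q = hmc_step(q, grad, log_post)
        samples.append(q)
    print(np.mean(samples, axis=0), np.std(samples, axis=0))  # ~0 and ~1
    ```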

  19. Towards fast, rigorous and efficient conformational sampling of biomolecules: Advances in accelerated molecular dynamics.

    PubMed

    Doshi, Urmi; Hamelberg, Donald

    2015-05-01

    Accelerated molecular dynamics (aMD) has proven to be a powerful biasing method for enhanced sampling of biomolecular conformations on general-purpose computational platforms. Biologically important long-timescale events that are beyond the reach of standard molecular dynamics can be accessed in aMD without losing the detailed atomistic description of the system. Compared with other biasing methods, aMD offers the advantage of tuning the level of acceleration to access the desired timescale without any advance knowledge of the reaction coordinate. Recent advances in the implementation of aMD and its applications to small peptides and biological macromolecules are reviewed here, along with a brief account of all the aMD variants introduced in the last decade. In comparison to the original implementation of aMD, the recent variant in which all the rotatable dihedral angles are accelerated (RaMD) exhibits faster convergence rates and a significant improvement in the statistical accuracy of retrieved thermodynamic properties. RaMD in conjunction with accelerating diffusive degrees of freedom, i.e., dual boosting, has been rigorously tested on the most difficult conformational sampling problem, protein folding. It has been shown that RaMD with dual boosting is capable of efficiently sampling multiple folding and unfolding events in small fast-folding proteins. RaMD with the dual-boost approach opens exciting possibilities for sampling multiple timescales in biomolecules. While equilibrium properties can be recovered satisfactorily from aMD-based methods, directly obtaining dynamics and kinetic rates for larger systems remains a future challenge. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014 Elsevier B.V. All rights reserved.
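
    The core of aMD is a boost potential that raises energy basins lying below a threshold E, with canonical averages recovered by reweighting each frame with exp(beta*dV). A minimal sketch of the standard boost form, dV = (E - V)^2 / (alpha + E - V) for V < E (Hamelberg et al.), follows; the energies, E, and alpha values are illustrative assumptions.

      import numpy as np

      def amd_boost(V, E, alpha):
          """Standard aMD boost: applied only where the potential V < E."""
          dV = np.where(V < E, (E - V) ** 2 / (alpha + E - V), 0.0)
          return V + dV, dV

      beta = 1.0 / (0.0019872041 * 300.0)   # 1/(kB*T) in kcal/mol at 300 K
      V = np.linspace(-120.0, -60.0, 7)     # toy potential energies (assumed)
      V_boosted, dV = amd_boost(V, E=-80.0, alpha=10.0)
      weights = np.exp(beta * dV)           # per-frame reweighting factors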

  20. Spatial and Temporal Distribution of Multiple Cropping Indices in the North China Plain Using a Long Remote Sensing Data Time Series.

    PubMed

    Zhao, Yan; Bai, Linyan; Feng, Jianzhong; Lin, Xiaosong; Wang, Li; Xu, Lijun; Ran, Qiyun; Wang, Kui

    2016-04-19

    Multiple cropping provides China with a very important system of intensive cultivation, and it can effectively enhance the efficiency of farmland use while improving regional food production and security. A multiple cropping index (MCI), which represents the intensity of multiple cropping and reflects the effects of climate change on agricultural production and cropping systems, often serves as a useful parameter. Therefore, monitoring the dynamic changes in the MCI of farmland over a large area using remote sensing data is essential. For this purpose, nearly 30 years of MCIs for dry land in the North China Plain (NCP) were efficiently extracted from remotely sensed leaf area index (LAI) data from the Global LAnd Surface Satellite (GLASS), and the characteristics of the spatial-temporal change in MCI were analyzed. First, 2162 typical arable sample sites were selected based on a gridded spatial sampling strategy, and the LAI information was extracted from the samples. Second, the Savitzky-Golay filter was used to smooth the LAI time-series data of the samples, and the MCIs of the samples were obtained using a second-order difference algorithm. Finally, the geo-statistical Kriging method was employed to map the spatial distribution of the MCIs and to obtain a time-series dataset of the MCIs of dry land over the NCP. The results showed that MCIs across the NCP exhibited an increasing trend over the entire study period and increased most rapidly from 1982 to 2002. Spatially, MCIs decreased from south to north, and high MCIs were mainly concentrated in the relatively flat areas. In addition, the spatial changes in MCIs had clear geographical characteristics, with the largest change in Henan Province.
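
    The smooth-then-count step can be sketched in a few lines: Savitzky-Golay smoothing of an annual LAI series, then counting sign changes of the first difference as growing-season peaks (a simple reading of the second-order difference idea; the synthetic double-cropping LAI curve below is an assumption, not GLASS data).

      import numpy as np
      from scipy.signal import savgol_filter

      rng = np.random.default_rng(0)
      t = np.arange(46)                           # one year of 8-day composites
      # Toy double-cropping LAI curve (two peaks per year) plus noise.
      lai = 1.0 + 2.0 * np.sin(np.pi * t / 23.0) ** 2 + 0.05 * rng.standard_normal(46)

      smooth = savgol_filter(lai, window_length=9, polyorder=3)
      d = np.diff(smooth)                         # first differences
      peaks = np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1  # rising-to-falling sign changes
      print("multiple cropping index:", len(peaks))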

  1. Spatial and Temporal Distribution of Multiple Cropping Indices in the North China Plain Using a Long Remote Sensing Data Time Series

    PubMed Central

    Zhao, Yan; Bai, Linyan; Feng, Jianzhong; Lin, Xiaosong; Wang, Li; Xu, Lijun; Ran, Qiyun; Wang, Kui

    2016-01-01

    Multiple cropping provides China with a very important system of intensive cultivation, and it can effectively enhance the efficiency of farmland use while improving regional food production and security. A multiple cropping index (MCI), which represents the intensity of multiple cropping and reflects the effects of climate change on agricultural production and cropping systems, often serves as a useful parameter. Therefore, monitoring the dynamic changes in the MCI of farmland over a large area using remote sensing data is essential. For this purpose, nearly 30 years of MCIs for dry land in the North China Plain (NCP) were efficiently extracted from remotely sensed leaf area index (LAI) data from the Global LAnd Surface Satellite (GLASS), and the characteristics of the spatial-temporal change in MCI were analyzed. First, 2162 typical arable sample sites were selected based on a gridded spatial sampling strategy, and the LAI information was extracted from the samples. Second, the Savitzky-Golay filter was used to smooth the LAI time-series data of the samples, and the MCIs of the samples were obtained using a second-order difference algorithm. Finally, the geo-statistical Kriging method was employed to map the spatial distribution of the MCIs and to obtain a time-series dataset of the MCIs of dry land over the NCP. The results showed that MCIs across the NCP exhibited an increasing trend over the entire study period and increased most rapidly from 1982 to 2002. Spatially, MCIs decreased from south to north, and high MCIs were mainly concentrated in the relatively flat areas. In addition, the spatial changes in MCIs had clear geographical characteristics, with the largest change in Henan Province. PMID:27104536

  2. Aerosol Sampling with Low Wind Sensitivity.

    NASA Astrophysics Data System (ADS)

    Kalatoor, Suresh

    Occupational exposure to airborne particles is generally evaluated by wearing a personal sampler that collects aerosol particles from the worker's breathing zone during the work cycle. The overall sampling efficiency of most currently available samplers is sensitive to wind velocity and direction. In addition, most samplers have internal losses due to gravitational settling, electrostatic interactions, and internal turbulence. A new sampling technique has been developed, theoretically and experimentally evaluated, and compared to existing techniques. The overall sampling efficiency of the prototype sampler was compared to that of a commonly used sampler, the 25-mm closed-face cassette. Uranine was used as the challenge aerosol, with particle physical diameters of 13.5, 20 and 30 μm. The wind velocity ranged from 100 to 300 cm s⁻¹. The prototype was found to have lower internal losses and less dependence on wind velocity and direction. It also yielded better uniformity in the distribution of large particles on the filter surface, an advantage for several types of analysis. A new general equation for sharp-edged inlets was developed that predicts the sampling efficiency of sharp-edged (or thin-walled) inlets in most occupational environments that are weakly disturbed, with air motions that cannot be strictly classified as calm air or fast-moving air. Computational analysis was carried out using the new general equation and was applied to situations in which the wind velocity vector is not steady but fluctuates around predominant average values of its magnitude and orientation. Two sampling environments, horizontal aerosol flow (ambient atmosphere) and vertical aerosol flow (industrial stacks), were considered. It was found that even for small fluctuations in wind direction the sampling efficiency may be significantly less than that obtained for the mean wind direction. Time variations in wind magnitude at a fixed wind direction were found to affect the sampling efficiency to a lesser degree. This led to the development of a new sampling technique that significantly improved the sampling characteristics of the inlet. The newly developed inlet has a curved surface with evenly spaced sampling orifices. Visualization of the streamlines over the sampler and limiting-streamline quantitative analysis showed negligible turbulence effects due to the sampler inlet's geometry. The overall sampling efficiency was found to be superior to that of the commonly used 25-mm closed-face cassette.

  3. Composting Explosives/Organics Contaminated Soils

    DTIC Science & Technology

    1986-05-01

    Quantitation of 14C Trapped by Activated Carbon ... Preliminary Extraction Trials ... Tetryl Product ... ppm (standard deviation 1892 ppm). All samples of soil from Letterkenny AD were pooled to yield one composite sample. Pooled samples from Louisiana ... combustion efficiency, and counting efficiency. Quantitation of 14C Trapped by Activated Carbon: Random subsamples of carbon from the air intake ...

  4. Efficient sampling over rough energy landscapes with high barriers: A combination of metadynamics with integrated tempering sampling.

    PubMed

    Yang, Y Isaac; Zhang, Jun; Che, Xing; Yang, Lijiang; Gao, Yi Qin

    2016-03-07

    In order to efficiently overcome high free energy barriers embedded in a complex energy landscape and to calculate overall thermodynamic properties using molecular dynamics simulations, we developed and implemented a sampling strategy combining metadynamics with the (selective) integrated tempering sampling (ITS/SITS) method. The dominant local minima on the potential energy surface (PES) are partially exalted by accumulating history-dependent potentials, as in metadynamics, and the sampling over the entire PES is further enhanced by ITS/SITS. With this hybrid method, the simulated system can be rapidly driven across the dominant barrier along selected collective coordinates. Then, ITS/SITS ensures a fast convergence of the sampling over the entire PES and an efficient calculation of the overall thermodynamic properties of the simulated system. To test the accuracy and efficiency of this method, we first benchmarked it on the calculation of the ϕ - ψ distribution of alanine dipeptide in explicit solvent. We further applied it to examine the design of template molecules for aromatic meta-C-H activation in solutions and to investigate solution conformations of the nonapeptide Bradykinin, which involve slow cis-trans isomerizations of three proline residues.
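
    The history-dependent part of the hybrid scheme can be illustrated with plain one-dimensional metadynamics: Gaussian hills are deposited at visited collective-variable values so previously sampled minima are progressively filled. The double-well potential, hill parameters, and Monte Carlo propagation below are toy assumptions, and the ITS/SITS layer is omitted.

      import numpy as np

      def bias_energy(s, centers, w=0.5, sigma=0.1):
          """Sum of deposited Gaussian hills at collective-variable value s."""
          centers = np.asarray(centers, dtype=float)
          return np.sum(w * np.exp(-(s - centers) ** 2 / (2 * sigma ** 2)))

      pes = lambda s: (s ** 2 - 1.0) ** 2          # toy double well (assumed)
      rng = np.random.default_rng(1)
      s, centers, beta = -1.0, [], 4.0
      for step in range(5000):
          s_try = s + 0.1 * rng.standard_normal()  # trial move in the CV
          dE = (pes(s_try) + bias_energy(s_try, centers)) - (pes(s) + bias_energy(s, centers))
          if np.log(rng.random()) < -beta * dE:    # Metropolis test on biased energy
              s = s_try
          if step % 100 == 0:                      # periodically deposit a hill
              centers.append(s)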

  5. Efficient sampling over rough energy landscapes with high barriers: A combination of metadynamics with integrated tempering sampling

    NASA Astrophysics Data System (ADS)

    Yang, Y. Isaac; Zhang, Jun; Che, Xing; Yang, Lijiang; Gao, Yi Qin

    2016-03-01

    In order to efficiently overcome high free energy barriers embedded in a complex energy landscape and to calculate overall thermodynamic properties using molecular dynamics simulations, we developed and implemented a sampling strategy combining metadynamics with the (selective) integrated tempering sampling (ITS/SITS) method. The dominant local minima on the potential energy surface (PES) are partially exalted by accumulating history-dependent potentials, as in metadynamics, and the sampling over the entire PES is further enhanced by ITS/SITS. With this hybrid method, the simulated system can be rapidly driven across the dominant barrier along selected collective coordinates. Then, ITS/SITS ensures a fast convergence of the sampling over the entire PES and an efficient calculation of the overall thermodynamic properties of the simulated system. To test the accuracy and efficiency of this method, we first benchmarked it on the calculation of the ϕ - ψ distribution of alanine dipeptide in explicit solvent. We further applied it to examine the design of template molecules for aromatic meta-C-H activation in solutions and to investigate solution conformations of the nonapeptide Bradykinin, which involve slow cis-trans isomerizations of three proline residues.

  6. Efficient sampling over rough energy landscapes with high barriers: A combination of metadynamics with integrated tempering sampling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Y. Isaac; Zhang, Jun; Che, Xing

    2016-03-07

    In order to efficiently overcome high free energy barriers embedded in a complex energy landscape and to calculate overall thermodynamic properties using molecular dynamics simulations, we developed and implemented a sampling strategy combining metadynamics with the (selective) integrated tempering sampling (ITS/SITS) method. The dominant local minima on the potential energy surface (PES) are partially exalted by accumulating history-dependent potentials, as in metadynamics, and the sampling over the entire PES is further enhanced by ITS/SITS. With this hybrid method, the simulated system can be rapidly driven across the dominant barrier along selected collective coordinates. Then, ITS/SITS ensures a fast convergence of the sampling over the entire PES and an efficient calculation of the overall thermodynamic properties of the simulated system. To test the accuracy and efficiency of this method, we first benchmarked it on the calculation of the ϕ - ψ distribution of alanine dipeptide in explicit solvent. We further applied it to examine the design of template molecules for aromatic meta-C-H activation in solutions and to investigate solution conformations of the nonapeptide Bradykinin, which involve slow cis-trans isomerizations of three proline residues.

  7. Determination of Minimum Training Sample Size for Microarray-Based Cancer Outcome Prediction–An Empirical Assessment

    PubMed Central

    Cheng, Ningtao; Wu, Leihong; Cheng, Yiyu

    2013-01-01

    The promise of microarray technology in providing prediction classifiers for cancer outcome estimation has been confirmed by a number of demonstrable successes. However, the reliability of prediction results relies heavily on the accuracy of the statistical parameters involved in the classifiers, and these parameters cannot be reliably estimated from only a small number of training samples. Therefore, it is of vital importance to determine the minimum number of training samples required to ensure the clinical value of microarrays in cancer outcome prediction. We evaluated the impact of training sample size on model performance extensively, based on 3 large-scale cancer microarray datasets provided by the second phase of the MicroArray Quality Control project (MAQC-II). An SSNR-based (scale of signal-to-noise ratio) protocol is proposed in this study for determining the minimum training sample size. External validation results based on another 3 cancer datasets confirmed that the SSNR-based approach could not only determine the minimum number of training samples efficiently, but also provide a valuable strategy for estimating the underlying performance of classifiers in advance. Once translated into routine clinical applications, the SSNR-based protocol would greatly facilitate microarray-based cancer outcome prediction by improving classifier reliability. PMID:23861920
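
    The SSNR protocol itself is not reproduced here; as a generic alternative illustration of minimum-sample-size estimation, one can fit an inverse power law to a pilot learning curve and find where added training samples stop paying off. All accuracies and sizes below are invented.

      import numpy as np
      from scipy.optimize import curve_fit

      # Assumed accuracies measured at pilot training-set sizes.
      n = np.array([25, 50, 100, 200, 400], dtype=float)
      acc = np.array([0.71, 0.78, 0.83, 0.86, 0.88])

      def learning_curve(n, a, b, c):      # inverse power law: a - b * n**(-c)
          return a - b * n ** (-c)

      (a, b, c), _ = curve_fit(learning_curve, n, acc, p0=[0.9, 1.0, 0.5])
      # Smallest n whose predicted accuracy is within 1% of the plateau a.
      n_grid = np.arange(25, 5001, dtype=float)
      n_min = n_grid[learning_curve(n_grid, a, b, c) >= a - 0.01][0]
      print("estimated minimum training sample size:", int(n_min))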

  8. A Dictionary Learning Approach for Signal Sampling in Task-Based fMRI for Reduction of Big Data

    PubMed Central

    Ge, Bao; Li, Xiang; Jiang, Xi; Sun, Yifei; Liu, Tianming

    2018-01-01

    The exponential growth of fMRI big data offers researchers an unprecedented opportunity to explore functional brain networks. However, this opportunity has not yet been fully exploited due to the lack of effective and efficient tools for handling such fMRI big data. One major challenge is that computing capabilities still lag behind the growth of large-scale fMRI databases; e.g., it takes many days to perform dictionary learning and sparse coding of whole-brain fMRI data for an fMRI database of average size. Therefore, how to reduce the data size without losing important information becomes an increasingly pressing issue. To address this problem, we propose a signal sampling approach for significant fMRI data reduction before performing structurally guided dictionary learning and sparse coding of whole-brain fMRI data. We compared the proposed structurally guided sampling method with no sampling, random sampling and uniform sampling schemes, and experiments on the Human Connectome Project (HCP) task fMRI data demonstrated that the proposed method can achieve a more than 15-fold speed-up without sacrificing accuracy in identifying task-evoked functional brain networks. PMID:29706880
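
    A rough sketch of the sample-then-decompose idea follows, with plain random sampling standing in for the paper's structurally guided scheme; the synthetic voxels-by-timepoints matrix and all sizes are assumptions, not HCP data.

      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning

      rng = np.random.default_rng(0)
      X = rng.standard_normal((50000, 120))        # voxel signals x time points (toy)

      # Keep ~1/16 of the voxel signals before the expensive decomposition.
      keep = rng.choice(X.shape[0], size=X.shape[0] // 16, replace=False)
      X_sampled = X[keep]

      dico = MiniBatchDictionaryLearning(n_components=30, batch_size=256, random_state=0)
      codes = dico.fit_transform(X_sampled)        # sparse codes for sampled signals
      D = dico.components_                         # learned temporal dictionary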

  9. A Dictionary Learning Approach for Signal Sampling in Task-Based fMRI for Reduction of Big Data.

    PubMed

    Ge, Bao; Li, Xiang; Jiang, Xi; Sun, Yifei; Liu, Tianming

    2018-01-01

    The exponential growth of fMRI big data offers researchers an unprecedented opportunity to explore functional brain networks. However, this opportunity has not yet been fully exploited due to the lack of effective and efficient tools for handling such fMRI big data. One major challenge is that computing capabilities still lag behind the growth of large-scale fMRI databases; e.g., it takes many days to perform dictionary learning and sparse coding of whole-brain fMRI data for an fMRI database of average size. Therefore, how to reduce the data size without losing important information becomes an increasingly pressing issue. To address this problem, we propose a signal sampling approach for significant fMRI data reduction before performing structurally guided dictionary learning and sparse coding of whole-brain fMRI data. We compared the proposed structurally guided sampling method with no sampling, random sampling and uniform sampling schemes, and experiments on the Human Connectome Project (HCP) task fMRI data demonstrated that the proposed method can achieve a more than 15-fold speed-up without sacrificing accuracy in identifying task-evoked functional brain networks.

  10. Notification: CTS Asheville Superfund Site Update: Sampling, Monitoring, Communication and Opportunities for Cleanup Efficiencies

    EPA Pesticide Factsheets

    Project #OPE-FY14-0044, July 22, 2014. The EPA OIG plans to begin preliminary research of the EPA's sampling, monitoring, communication and opportunities for cleanup efficiencies for the CTS Asheville Superfund Site, North Carolina.

  11. Hierarchical Surface Architecture of Plants as an Inspiration for Biomimetic Fog Collectors.

    PubMed

    Azad, M A K; Barthlott, W; Koch, K

    2015-12-08

    Fog collectors can enable us to alleviate the water crisis in certain arid regions of the world. A continuous fog-collection cycle, consisting of persistent capture of fog droplets and their fast transport to the target, is a prerequisite for developing an efficient fog collector. In this regard, a superior biological design has been found in the hierarchical surface architecture of barley (Hordeum vulgare) awns. We demonstrate here the highly wettable (advancing contact angle 16° ± 2.7°, receding contact angle 9° ± 2.6°) barbed (barb = conical structure) awn as a model to develop optimized fog collectors with a high fog-capturing capability, effective water transport, and above all efficient fog collection. We compare the fog-collection efficiency of the model sample with that of other plant samples naturally grown in foggy habitats that are supposed to be very efficient fog collectors. The model sample, consisting of dry hydrophilized awns (DH awns), is found to be about twice as efficient (fog-collection rate 563.7 ± 23.2 μg/cm² over 10 min) as any other sample investigated under controlled experimental conditions. Finally, a design based on the hierarchical surface architecture of the model sample is proposed for the development of optimized biomimetic fog collectors.

  12. Efficient SRAM yield optimization with mixture surrogate modeling

    NASA Astrophysics Data System (ADS)

    Zhongjian, Jiang; Zuochang, Ye; Yan, Wang

    2016-12-01

    Largely repeated cells such as SRAM cells usually require an extremely low failure rate to ensure a moderate chip yield. Though fast Monte Carlo methods such as importance sampling and its variants can be used for yield estimation, they are still very expensive if one needs to perform optimization based on such estimations. The yield calculation typically requires a large number of SPICE simulations, and these circuit simulations account for the largest proportion of the yield-calculation time. In this paper, a new method is proposed to address this issue. The key idea is to establish an efficient mixture surrogate model built on both the design variables and the process variables. The model is constructed by running SPICE simulations to obtain a set of sample points, which are then used to train the mixture surrogate model with the lasso algorithm. Experimental results show that the proposed model calculates the yield accurately and brings significant speed-ups to the failure-rate calculation. Building on this model, we developed a further-accelerated algorithm to enhance the speed of the yield calculation. The approach is suitable for high-dimensional process variables and multi-performance applications.
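
    The importance-sampling step mentioned above can be sketched generically: draw process-variable samples from a distribution shifted toward the failure region and reweight each sample by the likelihood ratio of the nominal to the shifted density. The one-dimensional process variable, failure threshold, and shift below are assumptions, not the paper's circuit.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      fail = lambda x: x > 4.5                  # toy failure criterion (assumed)

      # Nominal process variable ~ N(0, 1); failures are ~3.4e-6 rare events,
      # far too rare for plain Monte Carlo with 1e5 samples.
      shift = 4.5                               # mean of the shifted sampling density
      x = rng.normal(loc=shift, scale=1.0, size=100_000)
      w = norm.pdf(x) / norm.pdf(x, loc=shift)  # likelihood-ratio weights
      p_fail = np.mean(fail(x) * w)
      print(f"estimated failure rate: {p_fail:.3e}")  # ~ P(X > 4.5) = 3.4e-6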

  13. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
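
    The paper's extractor construction is not reproduced here; for a flavor of randomness extraction, the classic von Neumann de-biasing trick on a biased bit stream is sketched below (the bias value 0.8 is an assumption). Non-overlapping pairs 01 and 10 are equally likely regardless of the bias, so mapping them to 0 and 1 and discarding 00/11 yields unbiased output bits.

      import numpy as np

      rng = np.random.default_rng(42)
      raw = (rng.random(200_000) < 0.8).astype(np.uint8)  # biased source, p(1) = 0.8

      pairs = raw[: len(raw) // 2 * 2].reshape(-1, 2)     # non-overlapping pairs
      mask = pairs[:, 0] != pairs[:, 1]                   # keep discordant pairs only
      unbiased = pairs[mask, 0]                           # 01 -> 0, 10 -> 1
      print("output bits:", unbiased.size, "mean:", unbiased.mean())  # mean ~ 0.5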

  14. Empowering a safer practice: PDAs are integral tools for nursing and health care.

    PubMed

    Hudson, Kathleen; Buell, Virginia

    2011-04-01

    This study's purpose was to assess the characteristics of personal digital assistant (PDA) uptake and use in both clinical and classroom work by baccalaureate student nurses (BSN) at a rural Texas university. Patient care has become more complicated, risk prone, automated and costly, and efficiencies at the bedside are needed to continue to provide safe and successful care within this environment. A purposive sample of nursing students using PDAs throughout their educational process was studied at three campus sites. The initial sample size was 105 students, followed by 94 students at the end of the first semester and 75 students at curriculum completion at the end of a 2-year period. Students completed structured and open-ended questions to assess their perspectives on PDA usage. Student uptake varied in relation to overall competency, with minimal to high utilization noted, and was influenced by current product costs. PDAs are developing into useful clinical tools by providing quick and important information for safer care. Using bedside PDAs effectively assists in maintaining patient safety, efficiency of care delivery and staff satisfaction. This study evaluates the initial implementation of PDAs by students, our future multitasking nurses. © 2011 The Authors. Journal compilation © 2011 Blackwell Publishing Ltd.

  15. Simple, quick and cost-efficient: A universal RT-PCR and sequencing strategy for genomic characterisation of foot-and-mouth disease viruses.

    PubMed

    Dill, V; Beer, M; Hoffmann, B

    2017-08-01

    Foot-and-mouth disease (FMD) is a major contributor to poverty and food insecurity in Africa and Asia, and it is one of the biggest threats to agriculture in highly developed countries. As FMD is extremely contagious, strategies for its prevention, early detection, and the immediate characterisation of outbreak strains are of great importance. The generation of whole-genome sequences enables phylogenetic characterisation and the epidemiological tracing of virus transmission pathways, and it supports disease control strategies. This study describes the development and validation of a rapid, universal and cost-efficient RT-PCR system to generate genome sequences of FMDV spanning the region from the IRES to the end of the open reading frame. The method was evaluated using twelve different virus strains covering all seven serotypes of FMDV. Additionally, samples from experimentally infected animals were tested to mimic diagnostic field samples. All primer pairs showed robust amplification with high sensitivity for all serotypes. In summary, the described assay is suitable for generating FMDV sequences from all serotypes to allow immediate phylogenetic analysis, detailed genotyping and molecular epidemiology. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  17. An efficient data mining framework for the characterization of symptomatic and asymptomatic carotid plaque using bidimensional empirical mode decomposition technique.

    PubMed

    Molinari, Filippo; Raghavendra, U; Gudigar, Anjan; Meiburger, Kristen M; Rajendra Acharya, U

    2018-02-23

    Atherosclerosis is a type of cardiovascular disease that may cause stroke. It is caused by the deposition of fatty plaque in the artery walls, which gradually reduces their elasticity and hence restricts the blood flow to the heart. Hence, an early prediction of carotid plaque deposition is important, as it can save lives. This paper proposes a novel data mining framework for the assessment of atherosclerosis in its early stage using ultrasound images. In this work, we use 1353 symptomatic and 420 asymptomatic carotid plaque ultrasound images. Our proposed method classifies the symptomatic and asymptomatic carotid plaques using bidimensional empirical mode decomposition (BEMD) and entropy features. The unbalanced data samples are compensated for using adaptive synthetic sampling (ADASYN), and the developed method yielded a promising accuracy of 91.43%, sensitivity of 97.26%, and specificity of 83.22% using fourteen features. Hence, the proposed method can be used as an assisting tool during the regular screening of carotid arteries in hospitals. Graphical abstract: Outline of our efficient data mining framework for the characterization of symptomatic and asymptomatic carotid plaques.
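
    The class-imbalance step can be illustrated with the imbalanced-learn implementation of ADASYN, which synthesizes minority-class samples adaptively in sparse regions; the Gaussian toy features below are assumptions, with 1353 vs 420 mirroring the class sizes and 14 the feature count reported above.

      import numpy as np
      from imblearn.over_sampling import ADASYN

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0.0, 1.0, (1353, 14)),   # majority-class features (toy)
                     rng.normal(1.0, 1.0, (420, 14))])   # minority-class features (toy)
      y = np.array([0] * 1353 + [1] * 420)

      X_res, y_res = ADASYN(random_state=0).fit_resample(X, y)
      print("before:", np.bincount(y), "after:", np.bincount(y_res))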

  18. Radiolytic degradation of a new diglycol-diamide ligand for actinide and lanthanide co-extraction from spent nuclear fuel

    NASA Astrophysics Data System (ADS)

    Ossola, Annalisa; Macerata, Elena; Tinonin, Dario A.; Faroldi, Federica; Giola, Marco; Mariani, Mario; Casnati, Alessandro

    2016-07-01

    Within the Partitioning and Transmutation strategies, great efforts have been devoted in the last decades to the development of lipophilic ligands able to co-extract trivalent lanthanides (Ln) and actinides (An) from spent nuclear fuel. Because of the harsh working conditions these ligands undergo, it is important to prove their chemical and radiolytic stability during the counter-current multi-stage extraction process. In the present work, the hydrolytic and radiolytic resistance of freshly prepared and aged organic solutions containing the new ligand (2,6-bis[(N-methyl-N-dodecyl)carboxamide]-4-methoxy-tetrahydro-pyran) was investigated in order to evaluate the impact on the safety and efficiency of the process. Liquid-liquid extraction tests with spiked solutions showed that the ligand's extraction performance is strongly impaired when samples are stored at room temperature in the light. Moreover, the extraction efficiency of the irradiated samples was found to be influenced by gamma irradiation, while the selectivity remained unchanged. Preliminary mass spectrometric data showed that degradation is mainly due to the acid-catalysed reaction of the ligand's carboxamide and ether groups with the 1-octanol present in the diluent.

  19. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  20. Flow properties and chemical composition of carob (Ceratonia siliqua L.) flours as related to particle size and seed presence.

    PubMed

    Benković, Maja; Belščak-Cvitanović, Ana; Bauman, Ingrid; Komes, Draženka; Srečec, Siniša

    2017-10-01

    Due to its abundance in carbohydrates, dietary fibres and bioactive compounds, as well as its wide availability and low price, carob (Ceratonia siliqua L.) flour has great potential for use as a functional ingredient. The aim of this study was to analyse this potential by assessing the physical and chemical properties of carob flours of different particle sizes, with and without seeds. The influence of seed presence on the physical and chemical properties of the flour was also investigated. Seed presence in carob flour led to higher cohesivity and cake strength. It also affected the extraction efficiency of polyphenols, which was confirmed by the ranking of samples according to their procyanidin and tannin contents. With regard to the carbohydrate content, significant differences (P<0.05) between the contents of fructose and glucose were established in samples differing by the presence of carob seeds. Spearman rank order correlations revealed significant relationships (P<0.05) between the physical and chemical properties of carob flours. These findings confirm the importance of understanding the physical and chemical properties of carob flours in order to use them efficiently as a functional food ingredient. Copyright © 2017 Elsevier Ltd. All rights reserved.
