Sample records for "sampling strategy based"

  1. A comparative proteomics method for multiple samples based on a 18O-reference strategy and a quantitation and identification-decoupled strategy.

    PubMed

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

    Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method has been criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility of protein identification results across samples. In the present study, a method combining an 18O-reference strategy and a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than previously used comparison methods based on transferring comparison or label-free strategies. In the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated according to retention time and accurate mass to identify differentially expressed proteins. This strategy made protein identification possible for all samples using a single pooled sample, therefore gave good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
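
    The decoupling step described above matches LC-MS quantitation features to LC-MS/MS identifications by retention time and accurate mass. Below is a minimal sketch of that kind of matching; the tolerance values, record layouts, and toy data are illustrative assumptions, not values from the paper.

```python
# Sketch: match LC-MS quantitation features to LC-MS/MS peptide IDs by
# retention time (min) and accurate mass (ppm). Tolerances are illustrative.
RT_TOL_MIN = 0.5      # hypothetical retention-time window
MASS_TOL_PPM = 10.0   # hypothetical mass-accuracy window

def ppm_error(observed, expected):
    return 1e6 * abs(observed - expected) / expected

def match_features_to_ids(features, identifications):
    """features: [(rt, mass, ratio)]; identifications: [(rt, mass, peptide)]."""
    matches = []
    for f_rt, f_mass, ratio in features:
        for i_rt, i_mass, peptide in identifications:
            if abs(f_rt - i_rt) <= RT_TOL_MIN and ppm_error(f_mass, i_mass) <= MASS_TOL_PPM:
                matches.append((peptide, ratio))
                break  # accept the first ID that falls inside both windows
    return matches

# toy data
features = [(35.2, 1479.754, 1.8), (52.9, 2211.096, 0.4)]
identifications = [(35.1, 1479.7536, "LVNEVTEFAK"), (61.0, 1639.938, "HLVDEPQNLIK")]
print(match_features_to_ids(features, identifications))
```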

  2. Automated storm water sampling on small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; King, K.W.; Slade, R.M.

    2003-01-01

    Few guidelines are currently available to assist in designing appropriate automated storm water sampling strategies for small watersheds. Therefore, guidance is needed to develop strategies that achieve an appropriate balance between accurate characterization of storm water quality and loads and limitations of budget, equipment, and personnel. In this article, we explore the important sampling strategy components (minimum flow threshold, sampling interval, and discrete versus composite sampling) and project-specific considerations (sampling goal, sampling and analysis resources, and watershed characteristics) based on personal experiences and pertinent field and analytical studies. These components and considerations are important in achieving the balance between sampling goals and limitations because they determine how and when samples are taken and the potential sampling error. Several general recommendations are made, including setting low minimum flow thresholds, using flow-interval or variable time-interval sampling, and using composite sampling to limit the number of samples collected. Guidelines are presented to aid in selection of an appropriate sampling strategy based on the user's project-specific considerations. Our experiences suggest these recommendations should allow implementation of a successful sampling strategy for most small watershed sampling projects with common sampling goals.
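
    The components discussed above (minimum flow threshold, flow-interval sampling, composite samples) can be illustrated with a simple trigger loop. The threshold, flow increment, and time step below are placeholders, not recommendations from the article.

```python
# Sketch of flow-interval (flow-proportional) storm water sampling logic.
# An aliquot is added to a composite bottle each time a fixed volume of flow
# has passed, but only while discharge exceeds a minimum threshold.
MIN_FLOW = 0.01          # m^3/s, hypothetical minimum flow threshold
FLOW_INCREMENT = 50.0    # m^3 of cumulative flow per aliquot (hypothetical)
TIME_STEP = 60.0         # s between flow readings

def composite_sample_times(flows):
    """flows: discharge readings (m^3/s) taken at TIME_STEP intervals."""
    cumulative = 0.0
    aliquot_times = []
    for i, q in enumerate(flows):
        if q < MIN_FLOW:
            continue                      # below threshold: no sampling
        cumulative += q * TIME_STEP       # volume passed during this interval
        while cumulative >= FLOW_INCREMENT:
            aliquot_times.append(i * TIME_STEP)
            cumulative -= FLOW_INCREMENT
    return aliquot_times

hydrograph = [0.005, 0.02, 0.15, 0.40, 0.30, 0.12, 0.04, 0.008]
print(composite_sample_times(hydrograph))
```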

  3. Population Pharmacokinetics and Optimal Sampling Strategy for Model-Based Precision Dosing of Melphalan in Patients Undergoing Hematopoietic Stem Cell Transplantation.

    PubMed

    Mizuno, Kana; Dong, Min; Fukuda, Tsuyoshi; Chandra, Sharat; Mehta, Parinda A; McConnell, Scott; Anaissie, Elias J; Vinks, Alexander A

    2018-05-01

    High-dose melphalan is an important component of conditioning regimens for patients undergoing hematopoietic stem cell transplantation. The current dosing strategy based on body surface area results in a high incidence of oral mucositis and gastrointestinal and liver toxicity. Pharmacokinetically guided dosing will individualize exposure and help minimize overexposure-related toxicity. The purpose of this study was to develop a population pharmacokinetic model and optimal sampling strategy. A population pharmacokinetic model was developed with NONMEM using 98 observations collected from 15 adult patients given the standard dose of 140 or 200 mg/m 2 by intravenous infusion. The determinant-optimal sampling strategy was explored with PopED software. Individual area under the curve estimates were generated by Bayesian estimation using full and the proposed sparse sampling data. The predictive performance of the optimal sampling strategy was evaluated based on bias and precision estimates. The feasibility of the optimal sampling strategy was tested using pharmacokinetic data from five pediatric patients. A two-compartment model best described the data. The final model included body weight and creatinine clearance as predictors of clearance. The determinant-optimal sampling strategies (and windows) were identified at 0.08 (0.08-0.19), 0.61 (0.33-0.90), 2.0 (1.3-2.7), and 4.0 (3.6-4.0) h post-infusion. An excellent correlation was observed between area under the curve estimates obtained with the full and the proposed four-sample strategy (R 2  = 0.98; p < 0.01) with a mean bias of -2.2% and precision of 9.4%. A similar relationship was observed in children (R 2  = 0.99; p < 0.01). The developed pharmacokinetic model-based sparse sampling strategy promises to achieve the target area under the curve as part of precision dosing.
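
    The structural model reported is a two-compartment model with intravenous infusion. The sketch below simulates such a profile and reads off concentrations at the reported optimal post-infusion times; the dose, infusion length, and all parameter values are hypothetical illustrations, not the published estimates, and the sketch does not reproduce the Bayesian estimation used in the study.

```python
# Sketch: two-compartment IV-infusion model (hypothetical parameters), with
# concentrations evaluated at the reported optimal post-infusion times.
import numpy as np
from scipy.integrate import solve_ivp

DOSE_MG, T_INF_H = 280.0, 0.5            # hypothetical dose and infusion duration
CL, V1, Q, V2 = 28.0, 18.0, 30.0, 20.0   # hypothetical clearances (L/h) and volumes (L)

def rhs(t, y):
    a1, a2 = y                            # drug amounts in central/peripheral compartments
    rate_in = DOSE_MG / T_INF_H if t <= T_INF_H else 0.0
    da1 = rate_in - (CL / V1) * a1 - (Q / V1) * a1 + (Q / V2) * a2
    da2 = (Q / V1) * a1 - (Q / V2) * a2
    return [da1, da2]

t_grid = np.linspace(0.0, 8.0, 2001)
sol = solve_ivp(rhs, (0.0, 8.0), [0.0, 0.0], t_eval=t_grid, max_step=0.01)
conc = sol.y[0] / V1                      # central-compartment concentration (mg/L)

auc_full = float(np.sum((conc[1:] + conc[:-1]) * np.diff(t_grid)) / 2)  # trapezoidal AUC
optimal_post_infusion = [0.08, 0.61, 2.0, 4.0]                          # reported windows' centers
sparse = np.interp([T_INF_H + t for t in optimal_post_infusion], t_grid, conc)
print(f"AUC (dense grid, 0-8 h): {auc_full:.1f} mg*h/L")
print("Concentrations at the optimal sampling times:", np.round(sparse, 2))
```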

  4. Sample allocation balancing overall representativeness and stratum precision.

    PubMed

    Diaz-Quijano, Fredi Alexander

    2018-05-07

    In large-scale surveys, it is often necessary to distribute a preset sample size among a number of strata. Researchers must decide between prioritizing overall representativeness and the precision of stratum estimates. Hence, I evaluated different sample allocation strategies based on stratum size. The strategies evaluated herein included allocation proportional to the stratum population; equal sample for all strata; and allocation proportional to the natural logarithm, cubic root, and square root of the stratum population. This study considered that, for a preset sample size, the dispersion index of stratum sampling fractions is correlated with the population estimator error, while the dispersion index of stratum-specific sampling errors measures the inequality in the distribution of precision. Identification of a balanced and efficient strategy was based on comparing these two dispersion indices. The balance and efficiency of the strategies changed depending on the overall sample size. As the sample to be distributed increased, the most efficient allocation strategies were, in order: equal sample for each stratum; proportional to the logarithm, the cubic root, and the square root of the stratum population; and proportional to the stratum population. Depending on sample size, each of the strategies evaluated could be considered in optimizing the sample to keep both overall representativeness and stratum-specific precision. Copyright © 2018 Elsevier Inc. All rights reserved.
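
    A minimal sketch of the five allocation rules compared above (proportional, equal, and proportional to the logarithm, cubic root, and square root of stratum population), applied to a toy population. The rounding scheme, example numbers, and the specific dispersion index shown are my own illustrative choices, not the paper's.

```python
# Sketch: distribute a preset sample size among strata under five allocation
# rules (toy stratum populations, largest-remainder rounding).
import numpy as np

def allocate(n_total, stratum_pops, transform):
    w = np.array([transform(p) for p in stratum_pops], dtype=float)
    ideal = n_total * w / w.sum()
    alloc = np.floor(ideal).astype(int)
    remainder = n_total - alloc.sum()
    alloc[np.argsort(-(ideal - alloc))[:remainder]] += 1   # hand out leftover units
    return alloc

pops = [120_000, 45_000, 9_000, 2_500, 800]   # hypothetical stratum populations
n = 1_000                                     # preset total sample size
rules = {
    "proportional": lambda p: p,
    "equal":        lambda p: 1.0,
    "log":          np.log,
    "cubic root":   np.cbrt,
    "square root":  np.sqrt,
}
for name, f in rules.items():
    a = allocate(n, pops, f)
    fractions = a / np.array(pops, dtype=float)            # stratum sampling fractions
    dispersion = fractions.var() / fractions.mean()        # variance-to-mean dispersion index
    print(f"{name:12s} {a}  dispersion of sampling fractions = {dispersion:.2e}")
```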

  5. Sequential ensemble-based optimal design for parameter estimation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Man, Jun; Zhang, Jiangjiang; Li, Weixuan

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
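
    A linear-Gaussian, one-step sketch of the underlying idea: from an ensemble of parameter samples, score each candidate measurement by the Shannon entropy difference implied by a Kalman-style covariance update and pick the most informative one. The forward operator, noise level, and dimensions below are illustrative assumptions, not the SEOD implementation itself.

```python
# Sketch: choose the next measurement by maximizing the Shannon entropy
# difference (SD) between prior and posterior parameter distributions,
# using an ensemble covariance and a linear observation operator.
import numpy as np

rng = np.random.default_rng(0)
n_ens, n_par = 200, 3
ensemble = rng.normal(size=(n_ens, n_par)) @ np.diag([2.0, 1.0, 0.5])  # prior samples
P = np.cov(ensemble, rowvar=False)                                     # prior covariance

# Hypothetical candidates: each row maps the parameters to one observable.
H_candidates = np.array([[1.0, 0.0, 0.0],
                         [0.3, 0.8, 0.0],
                         [0.1, 0.2, 0.9]])
obs_var = 0.25                                                         # measurement noise variance

def entropy_difference(P, h, r):
    h = h.reshape(1, -1)
    s = (h @ P @ h.T).item() + r                  # innovation variance
    K = (P @ h.T) / s                             # Kalman gain
    P_post = P - K @ h @ P                        # posterior covariance
    _, logdet_prior = np.linalg.slogdet(P)
    _, logdet_post = np.linalg.slogdet(P_post)
    return 0.5 * (logdet_prior - logdet_post)     # Gaussian entropy reduction

scores = [entropy_difference(P, h, obs_var) for h in H_candidates]
print("SD per candidate:", np.round(scores, 3), "-> choose candidate", int(np.argmax(scores)))
```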

  6. Advanced Navigation Strategies for an Asteroid Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Bauman, J.; Getzandanner, K.; Williams, B.; Williams, K.

    2011-01-01

    The proximity operations phases of a sample return mission to an asteroid have been analyzed using advanced navigation techniques derived from experience gained in planetary exploration. These techniques rely on tracking types such as Earth-based radio metric Doppler and ranging, spacecraft-based ranging, and optical navigation using images of landmarks on the asteroid surface. Navigation strategies for the orbital phases leading up to sample collection, the touch down for collecting the sample, and the post sample collection phase at the asteroid are included. Options for successfully executing the phases are studied using covariance analysis and Monte Carlo simulations of an example mission to the near Earth asteroid 4660 Nereus. Two landing options were studied including trajectories with either one or two burns from orbit to the surface. Additionally, a comparison of post-sample collection strategies is presented. These strategies include remaining in orbit about the asteroid or standing-off a given distance until departure to Earth.

  7. Statistical strategy for inventorying and monitoring the ecosystem resources of the Mexican States of Jalisco and Colima at multiple scales and resolution levels

    Treesearch

    H. T. Schreuder; M. S. Williams; C. Aguirre-Bravo; P. L. Patterson

    2003-01-01

    The sampling strategy is presented for the initial phase of the natural resources pilot project in the Mexican States of Jalisco and Colima. The sampling design used is ground-based cluster sampling with poststratification based on Landsat Thematic Mapper imagery. The data collected will serve as a basis for additional data collection, mapping, and spatial modeling...

  8. MEASUREMENT OF MULTI-POLLUTANT AND MULTI-PATHWAY EXPOSURES IN A PROBABILITY-BASED SAMPLE OF CHILDREN: PRACTICAL STRATEGIES FOR EFFECTIVE FIELD STUDIES

    EPA Science Inventory

    The purpose of this manuscript is to describe the practical strategies developed for the implementation of the Minnesota Children's Pesticide Exposure Study (MNCPES), which is one of the first probability-based samples of multi-pathway and multi-pesticide exposures in children....

  9. Recruiting hard-to-reach United States population sub-groups via adaptations of snowball sampling strategy

    PubMed Central

    Sadler, Georgia Robins; Lee, Hau-Chen; Seung-Hwan Lim, Rod; Fullerton, Judith

    2011-01-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author’s program of research are provided to demonstrate how adaptations of snowball sampling can be effectively used in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or subjects for research studies when recruitment of a population-based sample is not essential. PMID:20727089

  10. Recruitment of hard-to-reach population subgroups via adaptations of the snowball sampling strategy.

    PubMed

    Sadler, Georgia Robins; Lee, Hau-Chen; Lim, Rod Seung-Hwan; Fullerton, Judith

    2010-09-01

    Nurse researchers and educators often engage in outreach to narrowly defined populations. This article offers examples of how variations on the snowball sampling recruitment strategy can be applied in the creation of culturally appropriate, community-based information dissemination efforts related to recruitment to health education programs and research studies. Examples from the primary author's program of research are provided to demonstrate how adaptations of snowball sampling can be used effectively in the recruitment of members of traditionally underserved or vulnerable populations. The adaptation of snowball sampling techniques, as described in this article, helped the authors to gain access to each of the more-vulnerable population groups of interest. The use of culturally sensitive recruitment strategies is both appropriate and effective in enlisting the involvement of members of vulnerable populations. Adaptations of snowball sampling strategies should be considered when recruiting participants for education programs or for research studies when the recruitment of a population-based sample is not essential.

  11. Cancer classification through filtering progressive transductive support vector machine based on gene expression data

    NASA Astrophysics Data System (ADS)

    Lu, Xinguo; Chen, Dan

    2017-08-01

    Traditional supervised classifiers only work with labeled data and neglect the large amount of data that does not have sufficient follow-up information. Consequently, the small sample size limits the design of an appropriate classifier. In this paper, a transductive learning method that combines a filtering strategy within the transductive framework with a progressive labeling strategy is presented. The progressive labeling strategy does not need to consider the distribution of labeled samples in order to evaluate the distribution of unlabeled samples, and can effectively solve the problem of estimating the proportion of positive and negative samples in the working set. Our experimental results demonstrate that the proposed technique has great potential in cancer prediction based on gene expression data.
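
    As a hedged illustration of progressive labeling in a transductive setting (a simplified self-training stand-in, not the authors' exact filtering/progressive algorithm), the sketch below repeatedly labels the unlabeled points about which an SVM is most confident and retrains on the enlarged set.

```python
# Sketch: progressive (self-training style) labeling with an SVM on synthetic
# "gene expression"-like data. Simplified stand-in for the paper's method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
rng = np.random.default_rng(0)
labeled = rng.choice(len(y), size=30, replace=False)        # small labeled set
unlabeled = np.setdiff1d(np.arange(len(y)), labeled)

X_lab, y_lab = X[labeled], y[labeled]
X_unlab = X[unlabeled]

for step in range(10):
    clf = SVC(kernel="linear").fit(X_lab, y_lab)
    if len(X_unlab) == 0:
        break
    margins = np.abs(clf.decision_function(X_unlab))
    take = np.argsort(-margins)[:10]                        # most confident unlabeled points
    X_lab = np.vstack([X_lab, X_unlab[take]])
    y_lab = np.concatenate([y_lab, clf.predict(X_unlab[take])])
    X_unlab = np.delete(X_unlab, take, axis=0)              # remove them from the work set

print("final training set size:", len(y_lab))
print("accuracy on all data:", round(clf.score(X, y), 3))
```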

  12. Influence of various water quality sampling strategies on load estimates for small streams

    USGS Publications Warehouse

    Robertson, Dale M.; Roerish, Eric D.

    1999-01-01

    Extensive streamflow and water quality data from eight small streams were systematically subsampled to represent various water‐quality sampling strategies. The subsampled data were then used to determine the accuracy and precision of annual load estimates generated by means of a regression approach (typically used for big rivers) and to determine the most effective sampling strategy for small streams. Estimation of annual loads by regression with daily average streamflow was imprecise regardless of the sampling strategy used; for the most effective strategy, median absolute errors were ∼30% relative to the reference load estimated with an integration method and all available data. The most effective sampling strategy depends on the length of the study. For 1‐year studies, fixed‐period monthly sampling supplemented by storm chasing was the most effective strategy. For studies of 2 or more years, fixed‐period semimonthly sampling resulted in not only the least biased but also the most precise loads. Additional high‐flow samples, typically collected to help define the relation between high streamflow and high loads, result in imprecise, overestimated annual loads if these samples are consistently collected early in high‐flow events.
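
    The regression approach referred to above typically relates log-transformed load to log streamflow on the sampled days and then applies the fitted relation to the full daily flow record. A minimal sketch under that assumption, with synthetic data and without the bias corrections (e.g., smearing) that are used in practice:

```python
# Sketch: annual load estimation by a log-log regression (rating curve) fitted
# to sampled days and applied to every day of streamflow. Synthetic data only.
import numpy as np

rng = np.random.default_rng(1)
days = 365
flow = rng.lognormal(mean=1.0, sigma=0.8, size=days)            # daily mean streamflow
true_conc = 5.0 * flow ** 0.4 * rng.lognormal(0.0, 0.2, days)   # synthetic concentration
true_load = np.sum(flow * true_conc)                            # "integration" reference load

sampled = rng.choice(days, size=24, replace=False)              # roughly semimonthly sampling
slope, intercept = np.polyfit(np.log(flow[sampled]),
                              np.log(flow[sampled] * true_conc[sampled]), 1)
est_load = np.sum(np.exp(intercept) * flow ** slope)            # regression estimate, no bias correction

print(f"reference load {true_load:.0f}, regression estimate {est_load:.0f}, "
      f"error {(est_load - true_load) / true_load:+.1%}")
```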

  13. Evaluating Monitoring Strategies to Detect Precipitation-Induced Microbial Contamination Events in Karstic Springs Used for Drinking Water

    PubMed Central

    Besmer, Michael D.; Hammes, Frederik; Sigrist, Jürg A.; Ort, Christoph

    2017-01-01

    Monitoring of microbial drinking water quality is a key component for ensuring safety and understanding risk, but conventional monitoring strategies are typically based on low sampling frequencies (e.g., quarterly or monthly). This is of concern because many drinking water sources, such as karstic springs, are often subject to changes in bacterial concentrations on much shorter time scales (e.g., hours to days), for example after precipitation events. Microbial contamination events are crucial from a risk assessment perspective and should therefore be targeted by monitoring strategies to establish both the frequency of their occurrence and the magnitude of bacterial peak concentrations. In this study we used monitoring data from two specific karstic springs. We assessed the performance of conventional monitoring based on historical records and tested a number of alternative strategies based on a high-resolution data set of bacterial concentrations in spring water collected with online flow cytometry (FCM). We quantified the effect of increasing sampling frequency and found that for the specific case studied, at least bi-weekly sampling would be needed to detect precipitation events with a probability of >90%. We then proposed an optimized monitoring strategy with three targeted samples per event, triggered by precipitation measurements. This approach is more effective and efficient than simply increasing overall sampling frequency. It would enable the water utility to (1) analyze any relevant event and (2) limit median underestimation of peak concentrations to approximately 10%. We conclude with a generalized perspective on sampling optimization and argue that the assessment of short-term dynamics causing microbial peak loads initially requires increased sampling/analysis efforts, but can be optimized subsequently to account for limited resources. This offers water utilities and public health authorities systematic ways to evaluate and optimize their current monitoring strategies. PMID:29213255

  14. Evaluating Monitoring Strategies to Detect Precipitation-Induced Microbial Contamination Events in Karstic Springs Used for Drinking Water.

    PubMed

    Besmer, Michael D; Hammes, Frederik; Sigrist, Jürg A; Ort, Christoph

    2017-01-01

    Monitoring of microbial drinking water quality is a key component for ensuring safety and understanding risk, but conventional monitoring strategies are typically based on low sampling frequencies (e.g., quarterly or monthly). This is of concern because many drinking water sources, such as karstic springs, are often subject to changes in bacterial concentrations on much shorter time scales (e.g., hours to days), for example after precipitation events. Microbial contamination events are crucial from a risk assessment perspective and should therefore be targeted by monitoring strategies to establish both the frequency of their occurrence and the magnitude of bacterial peak concentrations. In this study we used monitoring data from two specific karstic springs. We assessed the performance of conventional monitoring based on historical records and tested a number of alternative strategies based on a high-resolution data set of bacterial concentrations in spring water collected with online flow cytometry (FCM). We quantified the effect of increasing sampling frequency and found that for the specific case studied, at least bi-weekly sampling would be needed to detect precipitation events with a probability of >90%. We then proposed an optimized monitoring strategy with three targeted samples per event, triggered by precipitation measurements. This approach is more effective and efficient than simply increasing overall sampling frequency. It would enable the water utility to (1) analyze any relevant event and (2) limit median underestimation of peak concentrations to approximately 10%. We conclude with a generalized perspective on sampling optimization and argue that the assessment of short-term dynamics causing microbial peak loads initially requires increased sampling/analysis efforts, but can be optimized subsequently to account for limited resources. This offers water utilities and public health authorities systematic ways to evaluate and optimize their current monitoring strategies.
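
    One way to operationalize the event-targeted strategy proposed above (three targeted samples per precipitation event) is a simple trigger rule. The rain threshold and sample offsets below are placeholders, not values from the study.

```python
# Sketch: precipitation-triggered sampling that schedules three targeted
# samples per event instead of raising the overall fixed sampling frequency.
RAIN_THRESHOLD_MM = 10.0           # hypothetical daily precipitation trigger
SAMPLE_OFFSETS_H = (6, 18, 36)     # hypothetical sampling offsets after the trigger

def schedule_samples(daily_precip_mm):
    """Return (trigger_day, offset_hours) pairs; one set of samples per event."""
    schedule = []
    in_event = False
    for day, rain in enumerate(daily_precip_mm):
        if rain >= RAIN_THRESHOLD_MM and not in_event:
            schedule.extend((day, h) for h in SAMPLE_OFFSETS_H)
            in_event = True
        elif rain < RAIN_THRESHOLD_MM:
            in_event = False            # event over; re-arm the trigger
    return schedule

print(schedule_samples([0, 2, 15, 22, 3, 0, 11, 0]))
```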

  15. Case-based fracture image retrieval.

    PubMed

    Zhou, Xin; Stern, Richard; Müller, Henning

    2012-05-01

    Case-based fracture image retrieval can assist surgeons in decisions regarding new cases by supplying visually similar past cases. This tool may guide fracture fixation and management through comparison of long-term outcomes in similar cases. A fracture image database collected over 10 years at the orthopedic service of the University Hospitals of Geneva was used. This database contains 2,690 fracture cases associated with 43 classes (based on the AO/OTA classification). A case-based retrieval engine was developed and evaluated using retrieval precision as a performance metric. Only cases in the same class as the query case are considered as relevant. The scale-invariant feature transform (SIFT) is used for image analysis. Performance evaluation was computed in terms of mean average precision (MAP) and early precision (P10, P30). Retrieval results produced with the GNU image finding tool (GIFT) were used as a baseline. Two sampling strategies were evaluated. One used a dense 40 × 40 pixel grid sampling, and the second one used the standard SIFT features. Based on dense pixel grid sampling, three unsupervised feature selection strategies were introduced to further improve retrieval performance. With dense pixel grid sampling, the image is divided into 1,600 (40 × 40) square blocks. The goal is to emphasize the salient regions (blocks) and ignore irrelevant regions. Regions are considered as important when a high variance of the visual features is found. The first strategy is to calculate the variance of all descriptors on the global database. The second strategy is to calculate the variance of all descriptors for each case. A third strategy is to perform a thumbnail image clustering in a first step and then to calculate the variance for each cluster. Finally, a fusion between a SIFT-based system and GIFT is performed. A first comparison on the selection of sampling strategies using SIFT features shows that dense sampling using a pixel grid (MAP = 0.18) outperformed the SIFT detector-based sampling approach (MAP = 0.10). In a second step, three unsupervised feature selection strategies were evaluated. A grid parameter search is applied to optimize parameters for feature selection and clustering. Results show that using half of the regions (700 or 800) obtains the best performance for all three strategies. Increasing the number of clusters in clustering can also improve the retrieval performance. The SIFT descriptor variance in each case gave the best indication of saliency for the regions (MAP = 0.23), better than the other two strategies (MAP = 0.20 and 0.21). Combining GIFT (MAP = 0.23) and the best SIFT strategy (MAP = 0.23) produced significantly better results (MAP = 0.27) than each system alone. A case-based fracture retrieval engine was developed and is available for online demonstration. SIFT is used to extract local features, and three feature selection strategies were introduced and evaluated. A baseline using the GIFT system was used to evaluate the salient point-based approaches. Without supervised learning, SIFT-based systems with optimized parameters slightly outperformed the GIFT system. A fusion of the two approaches shows that the information contained in the two approaches is complementary. Supervised learning on the feature space is foreseen as the next step of this study.
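
    The best-performing selection strategy described above keeps the grid blocks whose local descriptors vary most within each case. A numpy-only sketch of that idea on a dense grid of descriptors; the 40x40 grid and the "keep about half" cutoff follow the abstract, while the random descriptors and the exact variance measure are illustrative assumptions.

```python
# Sketch: unsupervised region selection on a 40x40 block grid, keeping the
# blocks whose local descriptors show the highest variance within the case.
import numpy as np

rng = np.random.default_rng(0)
grid, dim = 40, 128                                   # 40x40 blocks, 128-D descriptor per block
descriptors = rng.normal(size=(grid * grid, dim))     # stand-in for densely sampled SIFT features

def select_salient_blocks(desc, keep=800):
    """Return indices of the `keep` blocks with the largest descriptor variance."""
    block_variance = desc.var(axis=1)                 # variance across descriptor dimensions
    return np.argsort(-block_variance)[:keep]

salient = select_salient_blocks(descriptors, keep=800)
print("kept", len(salient), "of", descriptors.shape[0], "blocks")
```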

  16. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards.

    PubMed

    Bornstein, Marc H; Jager, Justin; Putnick, Diane L

    2013-12-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study's target population, whether they yield representative and generalizable estimates of subsamples within a study's target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce "noise" related to variation in subsamples and whether that "noise" can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting.

  17. Sampling in Developmental Science: Situations, Shortcomings, Solutions, and Standards

    PubMed Central

    Bornstein, Marc H.; Jager, Justin; Putnick, Diane L.

    2014-01-01

    Sampling is a key feature of every study in developmental science. Although sampling has far-reaching implications, too little attention is paid to sampling. Here, we describe, discuss, and evaluate four prominent sampling strategies in developmental science: population-based probability sampling, convenience sampling, quota sampling, and homogeneous sampling. We then judge these sampling strategies by five criteria: whether they yield representative and generalizable estimates of a study’s target population, whether they yield representative and generalizable estimates of subsamples within a study’s target population, the recruitment efforts and costs they entail, whether they yield sufficient power to detect subsample differences, and whether they introduce “noise” related to variation in subsamples and whether that “noise” can be accounted for statistically. We use sample composition of gender, ethnicity, and socioeconomic status to illustrate and assess the four sampling strategies. Finally, we tally the use of the four sampling strategies in five prominent developmental science journals and make recommendations about best practices for sample selection and reporting. PMID:25580049

  18. Strategies to address participant misrepresentation for eligibility in Web-based research.

    PubMed

    Kramer, Jessica; Rubin, Amy; Coster, Wendy; Helmuth, Eric; Hermos, John; Rosenbloom, David; Moed, Rich; Dooley, Meghan; Kao, Ying-Chia; Liljenquist, Kendra; Brief, Deborah; Enggasser, Justin; Keane, Terence; Roy, Monica; Lachowicz, Mark

    2014-03-01

    Emerging methodological research suggests that the World Wide Web ("Web") is an appropriate venue for survey data collection, and a promising area for delivering behavioral intervention. However, the use of the Web for research raises concerns regarding sample validity, particularly when the Web is used for recruitment and enrollment. The purpose of this paper is to describe the challenges experienced in two different Web-based studies in which participant misrepresentation threatened sample validity: a survey study and an online intervention study. The lessons learned from these experiences generated three types of strategies researchers can use to reduce the likelihood of participant misrepresentation for eligibility in Web-based research. Examples of procedural/design strategies, technical/software strategies and data analytic strategies are provided along with the methodological strengths and limitations of specific strategies. The discussion includes a series of considerations to guide researchers in the selection of strategies that may be most appropriate given the aims, resources and target population of their studies. Copyright © 2014 John Wiley & Sons, Ltd.

  19. A novel 3D Cartesian random sampling strategy for Compressive Sensing Magnetic Resonance Imaging.

    PubMed

    Valvano, Giuseppe; Martini, Nicola; Santarelli, Maria Filomena; Chiappino, Dante; Landini, Luigi

    2015-01-01

    In this work we propose a novel acquisition strategy for accelerated 3D Compressive Sensing Magnetic Resonance Imaging (CS-MRI). This strategy is based on 3D Cartesian sampling with random switching of the frequency encoding direction with other K-space directions. Two 3D sampling strategies are presented. In the first strategy, the frequency encoding direction is randomly switched with one of the two phase encoding directions. In the second strategy, the frequency encoding direction is randomly chosen among all the directions of the K-space. These strategies can lower the coherence of the acquisition in order to produce reduced aliasing artifacts and to achieve better image quality after Compressive Sensing (CS) reconstruction. Furthermore, the proposed strategies can reduce the typical smoothing of CS due to the limited sampling of high-frequency locations. We demonstrated by means of simulations that the proposed acquisition strategies outperformed the standard Compressive Sensing acquisition, resulting in better quality of the reconstructed images and a greater achievable acceleration.
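
    A sketch of the second, fully random variant: each acquired line fully samples one randomly chosen K-space direction (acting as the frequency-encoding direction for that excitation) at random positions along the other two. The matrix size and the number of lines are arbitrary choices for illustration.

```python
# Sketch: 3D Cartesian undersampling mask in which the fully sampled
# (frequency-encoding) direction is switched at random for every acquired line.
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64, 64)            # K-space matrix size (arbitrary)
n_lines = 820                   # number of acquired readout lines (~20% sampling)

mask = np.zeros(shape, dtype=bool)
for _ in range(n_lines):
    axis = rng.integers(0, 3)                   # randomly chosen readout direction
    idx = [rng.integers(0, s) for s in shape]   # random position along the other two axes
    idx[axis] = slice(None)                     # the readout direction is fully sampled
    mask[tuple(idx)] = True

print(f"sampling fraction: {mask.mean():.3f}")
```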

  20. Nonprobability and probability-based sampling strategies in sexual science.

    PubMed

    Catania, Joseph A; Dolcini, M Margaret; Orellana, Roberto; Narayanan, Vasudah

    2015-01-01

    With few exceptions, much of sexual science builds upon data from opportunistic nonprobability samples of limited generalizability. Although probability-based studies are considered the gold standard in terms of generalizability, they are costly to apply to many of the hard-to-reach populations of interest to sexologists. The present article discusses recent conclusions by sampling experts that have relevance to sexual science that advocates for nonprobability methods. In this regard, we provide an overview of Internet sampling as a useful, cost-efficient, nonprobability sampling method of value to sex researchers conducting modeling work or clinical trials. We also argue that probability-based sampling methods may be more readily applied in sex research with hard-to-reach populations than is typically thought. In this context, we provide three case studies that utilize qualitative and quantitative techniques directed at reducing limitations in applying probability-based sampling to hard-to-reach populations: indigenous Peruvians, African American youth, and urban men who have sex with men (MSM). Recommendations are made with regard to presampling studies, adaptive and disproportionate sampling methods, and strategies that may be utilized in evaluating nonprobability and probability-based sampling methods.

  1. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    PubMed

    Xun-Ping, W; An, Z

    2017-07-27

    Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which took plant abundance as an auxiliary variable, was explored, and an experimental study in a 50 m × 50 m plot in a marshland in the Poyang Lake region was performed. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; secondly, the required number of optimal sampling points for each layer was calculated through the Hammond McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison study among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA was performed. Results The method (SOPA) proposed in this study had the minimal absolute error (0.2138), while the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.

  2. Random forests ensemble classifier trained with data resampling strategy to improve cardiac arrhythmia diagnosis.

    PubMed

    Ozçift, Akin

    2011-05-01

    Supervised classification algorithms are commonly used in the design of computer-aided diagnosis systems. In this study, we present a resampling-strategy-based Random Forests (RF) ensemble classifier to improve the diagnosis of cardiac arrhythmia. Random Forests is an ensemble classifier that consists of many decision trees and outputs the class that is the mode of the classes output by the individual trees. In this way, an RF ensemble classifier performs better than a single tree from a classification-performance point of view. In general, multiclass datasets with an unbalanced distribution of sample sizes are difficult to analyze in terms of class discrimination. Cardiac arrhythmia is such a dataset, with multiple classes of small sample sizes, and it is therefore a suitable test of our resampling-based training strategy. The dataset contains 452 samples in fourteen types of arrhythmias, and eleven of these classes have sample sizes of less than 15. Our diagnosis strategy consists of two parts: (i) a correlation-based feature selection algorithm is used to select relevant features from the cardiac arrhythmia dataset; (ii) the RF machine learning algorithm is used to evaluate the performance of the selected features with and without simple random sampling, to evaluate the efficiency of the proposed training strategy. The resultant accuracy of the classifier is found to be 90.0%, which is a quite high diagnosis performance for cardiac arrhythmia. Furthermore, three case studies, i.e., thyroid, cardiotocography and audiology, are used to benchmark the effectiveness of the proposed method. The results of the experiments demonstrate the efficiency of the random sampling strategy in training the RF ensemble classification algorithm. Copyright © 2011 Elsevier Ltd. All rights reserved.
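
    A hedged sketch of the two-part strategy on a synthetic imbalanced multiclass set: a simple correlation-with-target filter stands in for the correlation-based feature selection, and random oversampling of the small classes stands in for the simple random sampling step. Neither is claimed to match the paper's exact procedure or dataset.

```python
# Sketch: (i) correlation-based feature filter, (ii) Random Forests trained
# with and without random resampling of the under-represented classes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

X, y = make_classification(n_samples=450, n_features=60, n_informative=10,
                           n_classes=5, weights=[0.5, 0.3, 0.1, 0.06, 0.04],
                           random_state=0)

# (i) keep the features most correlated (in absolute value) with the class label
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
Xs = X[:, np.argsort(-corr)[:15]]

X_tr, X_te, y_tr, y_te = train_test_split(Xs, y, test_size=0.3, stratify=y, random_state=0)

# (ii) oversample each training class up to the size of the largest training class
counts = np.bincount(y_tr)
parts = [resample(X_tr[y_tr == c], y_tr[y_tr == c], replace=True,
                  n_samples=counts.max(), random_state=0) for c in np.unique(y_tr)]
X_bal, y_bal = np.vstack([p[0] for p in parts]), np.concatenate([p[1] for p in parts])

rf = RandomForestClassifier(n_estimators=200, random_state=0)
print("accuracy without resampling:", round(rf.fit(X_tr, y_tr).score(X_te, y_te), 3))
print("accuracy with resampling:   ", round(rf.fit(X_bal, y_bal).score(X_te, y_te), 3))
```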

  3. A model-based 'varimax' sampling strategy for a heterogeneous population.

    PubMed

    Akram, Nuzhat A; Farooqi, Shakeel R

    2014-01-01

    Sampling strategies are planned to enhance the homogeneity of a sample, and hence to minimize confounding errors. A sampling strategy was developed to minimize the variation within population groups. Karachi, the largest urban agglomeration in Pakistan, was used as a model population. Blood groups ABO and Rh factor were determined for 3000 unrelated individuals selected through simple random sampling. Among them, five population groups, namely Balochi, Muhajir, Pathan, Punjabi and Sindhi, were identified based on paternal ethnicity. An index was designed to measure the proportion of admixture at the parental and grandparental levels. Population models based on the index score were proposed. For validation, 175 individuals selected through stratified random sampling were genotyped for the three STR loci CSF1PO, TPOX and TH01. ANOVA showed significant differences across the population groups in blood group and STR loci distribution. Gene diversity was higher across the sub-population model than in the agglomerated population. At the parental level, gene diversities were significantly higher across no-admixture models than admixture models; at the grandparental level, the difference was not significant. A sub-population model with no admixture at the parental level was justified for sampling the heterogeneous population of Karachi.

  4. Sample injection and electrophoretic separation on a simple laminated paper based analytical device.

    PubMed

    Xu, Chunxiu; Zhong, Minghua; Cai, Longfei; Zheng, Qingyu; Zhang, Xiaojun

    2016-02-01

    We described a strategy to perform multistep operations on a simple laminated paper-based separation device by using electrokinetic flow to manipulate the fluids. A laminated crossed-channel paper-based separation device was fabricated by cutting a filter paper sheet followed by lamination. Multiple function units, including sample loading, sample injection, and electrophoretic separation, were integrated on a single paper-based analytical device for the first time by applying potentials at the different reservoirs for sample, sample waste, buffer, and buffer waste. As a proof-of-concept demonstration, a mixed sample solution containing carmine and sunset yellow was loaded in the sampling channel and then injected into the separation channel, followed by electrophoretic separation, by adjusting the potentials applied at the four terminals of the sampling and separation channels. The effects of buffer pH, buffer concentration, channel width, and separation time on the resolution of the electrophoretic separation were studied. This strategy may be used to perform multistep operations such as reagent dilution, sample injection, mixing, reaction, and separation on a single microfluidic paper-based analytical device, which is very attractive for building micro total analysis systems on microfluidic paper-based analytical devices. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    PubMed

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterize the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  6. Site Plan Safety Submission for Sampling, Monitoring, and Decontamination of Mustard Agent - South Plant, Rocky Mountain Arsenal. Volume 1

    DTIC Science & Technology

    1988-10-01

    sample these ducts. This judgement was based on the following factors: 1. The ducts were open to the atmosphere. 2. RMA records of building area samples...selected based on several factors including piping arrangements, volume to be sampled, sampling equipment flow rates, and the flow rate necessary for...effective sampling. Therefore, each sampling point strategy and procedure was customized based on these factors. The individual specific sampling

  7. Research and Engagement Strategies for Young Adult Immigrants Without Documentation: Lessons Learned Through Community Partnership.

    PubMed

    Raymond-Flesch, Marissa; Siemons, Rachel; Brindis, Claire D

    2016-01-01

    Limited research has focused on undocumented immigrants' health and access to care. This paper describes participant engagement strategies used to investigate the health needs of immigrants eligible for Deferred Action for Childhood Arrivals (DACA). Community-based strategies engaged advocates and undocumented Californians in study design and recruitment. Outreach in diverse settings, social media, and participant-driven sampling recruited 61 DACA-eligible focus group participants. Social media, community-based organizations (CBOs), family members, advocacy groups, and participant-driven sampling were the most successful recruitment strategies. Participants felt engaging in research was instrumental for sharing their concerns with health care providers and policymakers, noteworthy in light of their previously identified fears and mistrust of government officials. Using multiple culturally responsive strategies including participant-driven sampling, engagement with CBOs, and use of social media, those eligible for DACA eagerly engage as research participants. Educating researchers and institutional review boards (IRBs) about legal and safety concerns can improve research engagement.

  8. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    USDA-ARS?s Scientific Manuscript database

    Cumulative nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. This study used an agroecosystems simulation model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2...

  9. Chapter 2: Sampling strategies in forest hydrology and biogeochemistry

    Treesearch

    Roger C. Bales; Martha H. Conklin; Branko Kerkez; Steven Glaser; Jan W. Hopmans; Carolyn T. Hunsaker; Matt Meadows; Peter C. Hartsough

    2011-01-01

    Many aspects of forest hydrology have been based on accurate but not necessarily spatially representative measurements, reflecting the measurement capabilities that were traditionally available. Two developments are bringing about fundamental changes in sampling strategies in forest hydrology and biogeochemistry: (a) technical advances in measurement capability, as is...

  10. Accuracy and Cost-Effectiveness of Cervical Cancer Screening by High-Risk HPV DNA Testing of Self-Collected Vaginal Samples

    PubMed Central

    Balasubramanian, Akhila; Kulasingam, Shalini L.; Baer, Atar; Hughes, James P.; Myers, Evan R.; Mao, Constance; Kiviat, Nancy B.; Koutsky, Laura A.

    2010-01-01

    Objective Estimate the accuracy and cost-effectiveness of cervical cancer screening strategies based on high-risk HPV DNA testing of self-collected vaginal samples. Materials and Methods A subset of 1,665 women (18-50 years of age) participating in a cervical cancer screening study were screened by liquid-based cytology and by high-risk HPV DNA testing of both self-collected vaginal swab samples and clinician-collected cervical samples. Women with positive/abnormal screening test results and a subset of women with negative screening test results were triaged to colposcopy. Based on individual and combined test results, five screening strategies were defined. Estimates of sensitivity and specificity for cervical intraepithelial neoplasia grade 2 or worse were calculated and a Markov model was used to estimate the incremental cost-effectiveness ratios (ICERs) for each strategy. Results Compared to cytology-based screening, high-risk HPV DNA testing of self-collected vaginal samples was more sensitive (68%, 95%CI=58%-78% versus 85%, 95%CI=76%-94%) but less specific (89%, 95%CI=86%-91% versus 73%, 95%CI=67%-79%). A strategy of high-risk HPV DNA testing of self-collected vaginal samples followed by cytology triage of HPV positive women, was comparably sensitive (75%, 95%CI=64%-86%) and specific (88%, 95%CI=85%-92%) to cytology-based screening. In-home self-collection for high-risk HPV DNA detection followed by in-clinic cytology triage had a slightly lower lifetime cost and a slightly higher quality-adjusted life expectancy than did cytology-based screening (ICER of triennial screening compared to no screening was $9,871/QALY and $12,878/QALY, respectively). Conclusions Triennial screening by high-risk HPV DNA testing of in-home, self-collected vaginal samples followed by in-clinic cytology triage was cost-effective. PMID:20592553
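
    The incremental cost-effectiveness ratios ($/QALY) quoted above are simply the cost difference divided by the QALY difference between a strategy and its comparator. A toy computation with invented numbers, not values from the study:

```python
# Sketch: incremental cost-effectiveness ratio (ICER) of screening vs. no screening.
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

# hypothetical lifetime cost (USD) and quality-adjusted life expectancy (QALYs)
print(f"ICER = ${icer(1450.0, 26.622, 1200.0, 26.597):,.0f} per QALY gained")
```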

  11. Diagnostic test accuracy and prevalence inferences based on joint and sequential testing with finite population sampling.

    PubMed

    Su, Chun-Lung; Gardner, Ian A; Johnson, Wesley O

    2004-07-30

    The two-test two-population model, originally formulated by Hui and Walter for the estimation of test accuracy and prevalence, assumes conditionally independent tests, constant accuracy across populations and binomial sampling. The binomial assumption is incorrect if all individuals in a population (e.g. a child-care centre, a village in Africa, or a cattle herd) are sampled or if the sample size is large relative to the population size. In this paper, we develop statistical methods for evaluating diagnostic test accuracy and prevalence estimation based on finite sample data in the absence of a gold standard. Moreover, two tests are often applied simultaneously for the purpose of obtaining a 'joint' testing strategy that has either higher overall sensitivity or specificity than either of the two tests considered singly. Sequential versions of such strategies are often applied in order to reduce the cost of testing. We thus discuss joint (simultaneous and sequential) testing strategies and inference for them. Using the developed methods, we analyse two real and one simulated data sets, and we compare 'hypergeometric' and 'binomial-based' inferences. Our findings indicate that the posterior standard deviations for prevalence (but not sensitivity and specificity) based on finite population sampling tend to be smaller than their counterparts for infinite population sampling. Finally, we make recommendations about how small the sample size should be relative to the population size to warrant use of the binomial model for prevalence estimation. Copyright 2004 John Wiley & Sons, Ltd.
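
    Under the conditional-independence assumption used in this model, the joint sensitivity and specificity of simultaneous (parallel, "either test positive") and sequential (serial, "both tests positive") strategies follow directly from the single-test values; the numbers below are hypothetical test characteristics.

```python
# Sketch: joint accuracy of two conditionally independent diagnostic tests.
def parallel(se1, sp1, se2, sp2):
    """Positive if either test is positive: higher sensitivity, lower specificity."""
    return 1 - (1 - se1) * (1 - se2), sp1 * sp2

def serial(se1, sp1, se2, sp2):
    """Positive only if both tests are positive: lower sensitivity, higher specificity."""
    return se1 * se2, 1 - (1 - sp1) * (1 - sp2)

se1, sp1, se2, sp2 = 0.90, 0.95, 0.80, 0.98   # hypothetical test characteristics
print("parallel (Se, Sp):", parallel(se1, sp1, se2, sp2))
print("serial   (Se, Sp):", serial(se1, sp1, se2, sp2))
```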

  12. Statistical Strategy for Inventorying and Monitoring the Ecosystem Resources of the State of Jalisco at Multiple Scales and Resolution Levels

    Treesearch

    Robin M. Reich; Hans T. Schreuder

    2006-01-01

    The sampling strategy involving both statistical and in-place inventory information is presented for the natural resources project of the Green Belt area (Centuron Verde) in the Mexican state of Jalisco. The sampling designs used were a grid based ground sample of a 90x90 m plot and a two-stage stratified sample of 30 x 30 m plots. The data collected were used to...

  13. The use of a genetic algorithm-based search strategy in geostatistics: application to a set of anisotropic piezometric head data

    NASA Astrophysics Data System (ADS)

    Abedini, M. J.; Nasseri, M.; Burn, D. H.

    2012-04-01

    In any geostatistical study, an important consideration is the choice of an appropriate, repeatable, and objective search strategy that controls the nearby samples to be included in the location-specific estimation procedure. Almost all geostatistical software available in the market puts the onus on the user to supply search strategy parameters in a heuristic manner. These parameters are solely controlled by geographical coordinates that are defined for the entire area under study, and the user has no guidance as to how to choose these parameters. The main thesis of the current study is that the selection of search strategy parameters has to be driven by data—both the spatial coordinates and the sample values—and cannot be chosen beforehand. For this purpose, a genetic-algorithm-based ordinary kriging with moving neighborhood technique is proposed. The search capability of a genetic algorithm is exploited to search the feature space for appropriate, either local or global, search strategy parameters. Radius of circle/sphere and/or radii of standard or rotated ellipse/ellipsoid are considered as the decision variables to be optimized by GA. The superiority of GA-based ordinary kriging is demonstrated through application to the Wolfcamp Aquifer piezometric head data. Assessment of numerical results showed that definition of search strategy parameters based on both geographical coordinates and sample values improves cross-validation statistics when compared with that based on geographical coordinates alone. In the case of a variable search neighborhood for each estimation point, optimization of local search strategy parameters for an elliptical support domain—the orientation of which is dictated by anisotropic axes—via GA was able to capture the dynamics of piezometric head in west Texas/New Mexico in an efficient way.

  14. Consistently Sampled Correlation Filters with Space Anisotropic Regularization for Visual Tracking

    PubMed Central

    Shi, Guokai; Xu, Tingfa; Luo, Jiqiang; Li, Yuankun

    2017-01-01

    Most existing correlation filter-based tracking algorithms, which use fixed patches and cyclic shifts as training and detection measures, assume that the training samples are reliable and ignore the inconsistencies between training samples and detection samples. We propose to construct and study a consistently sampled correlation filter with space anisotropic regularization (CSSAR) to solve these two problems simultaneously. Our approach constructs a spatiotemporally consistent sample strategy to alleviate the redundancies in training samples caused by the cyclical shifts, eliminate the inconsistencies between training samples and detection samples, and introduce space anisotropic regularization to constrain the correlation filter for alleviating drift caused by occlusion. Moreover, an optimization strategy based on the Gauss-Seidel method was developed for obtaining robust and efficient online learning. Both qualitative and quantitative evaluations demonstrate that our tracker outperforms state-of-the-art trackers in object tracking benchmarks (OTBs). PMID:29231876

  15. Comparison of non-landslide sampling strategies to counteract inventory-based biases within national-scale statistical landslide susceptibility models

    NASA Astrophysics Data System (ADS)

    Lima, Pedro; Steger, Stefan; Glade, Thomas

    2017-04-01

    Landslides can represent a significant threat to people and infrastructure in hilly and mountainous landscapes worldwide. Understanding and predicting these geomorphic processes is crucial to avoid economic losses or even casualties and damage to property. Statistically based landslide susceptibility models are well known for being highly reliant on the quality, representativeness and availability of input data. In this context, several studies indicate that the landslide inventory represents the most important input data; however, each landslide mapping technique or data collection has its drawbacks. Consequently, biased landslide inventories may commonly be introduced into statistical models, especially at the regional or even national scale. It remains for the researcher to be aware of potential limitations and to design strategies that avoid or reduce the propagation of input-data errors and bias influences into the modelling outcomes. Previous studies have shown that such erroneous landslide inventories may lead to unrealistic landslide susceptibility maps. We assume that one possibility to tackle systematic landslide inventory-based biases is to concentrate on sampling strategies that focus on the distribution of non-landslide locations. For this purpose, we test an approach for the Austrian territory that concentrates on a modified non-landslide sampling strategy instead of the traditionally applied random sampling. It is expected that the way non-landslide locations are represented (e.g. equally over the area or only within those areas where mapping campaigns have been conducted) is important to reduce a potential bias-related over- or underestimation of landslide susceptibility within specific areas. Presumably, every landslide inventory is systematically incomplete, especially in those areas where no mapping campaign was previously conducted; this also applies to the inventory currently available for the Austrian territory, which comprises 14,519 shallow landslides. Within this study, we introduce the following explanatory variables to test the effect of different non-landslide strategies: lithological units grouped by their geotechnical properties, and topographic parameters such as aspect, elevation, slope gradient and topographic position. Landslide susceptibility maps will be derived by applying logistic regression, while systematic comparisons will be carried out between models created with different non-landslide sampling strategies. Models generated by conventional random sampling are compared against models based on stratified and clustered sampling strategies. The modelling results will be compared in terms of their prediction performance, measured by the AUROC (Area Under the Receiver Operating Characteristic Curve) obtained by means of k-fold cross-validation, and also by the spatial patterns of the maps. The outcomes of this study are intended to contribute to the understanding of how landslide inventory-based biases may be counteracted.
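
    A compact, purely synthetic sketch of the comparison described above: landslide presence points plus non-landslide points drawn either at random over the whole area or only from a "mapped" subregion, each fed to a logistic regression and scored by cross-validated AUROC. The data, predictors, and the way inventory bias is mimicked are illustrative assumptions, not the Austrian dataset or the authors' models.

```python
# Sketch: compare random vs. spatially restricted non-landslide sampling for a
# logistic-regression susceptibility model, scored by k-fold AUROC (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_cells = 20_000
slope = rng.uniform(0, 45, n_cells)                    # synthetic terrain attributes
wetness = rng.uniform(0, 1, n_cells)
p_slide = 1 / (1 + np.exp(-(0.12 * slope + 2.0 * wetness - 6.0)))
landslide = rng.random(n_cells) < p_slide              # synthetic "inventory"
mapped = slope > 20                                    # pretend only steep areas were mapped

X = np.column_stack([slope, wetness])
pres = np.flatnonzero(landslide)

def auroc_with_absences(absence_pool, n_abs=len(pres)):
    absences = rng.choice(absence_pool, size=n_abs, replace=False)
    idx = np.concatenate([pres, absences])
    y = np.concatenate([np.ones(len(pres)), np.zeros(n_abs)])
    model = LogisticRegression(max_iter=1000)
    return cross_val_score(model, X[idx], y, cv=5, scoring="roc_auc").mean()

random_pool = np.flatnonzero(~landslide)               # anywhere without a landslide
clustered_pool = np.flatnonzero(~landslide & mapped)   # only inside the "mapped" area
print("AUROC, random non-landslide sampling:   ", round(auroc_with_absences(random_pool), 3))
print("AUROC, clustered non-landslide sampling:", round(auroc_with_absences(clustered_pool), 3))
```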

  16. A multiple-objective optimal exploration strategy

    USGS Publications Warehouse

    Christakos, G.; Olea, R.A.

    1988-01-01

    Exploration for natural resources is accomplished through partial sampling of extensive domains. Such imperfect knowledge is subject to sampling error. Complex systems of equations resulting from modelling based on the theory of correlated random fields are reduced to simple analytical expressions providing global indices of estimation variance. The indices are utilized by multiple-objective decision criteria to find the best sampling strategies. The approach is not limited by the geometric nature of the sampling, covers a wide range of spatial continuity, and leads to a step-by-step procedure. © 1988.

  17. On the Application of Different Event-Based Sampling Strategies to the Control of a Simple Industrial Process

    PubMed Central

    Sánchez, José; Guarnes, Miguel Ángel; Dormido, Sebastián

    2009-01-01

    This paper is an experimental study of the utilization of different event-based strategies for the automatic control of a simple but very representative industrial process: the level control of a tank. In an event-based control approach it is the triggering of a specific event, and not the time, that instructs the sensor to send the current state of the process to the controller, and the controller to compute a new control action and send it to the actuator. In the document, five control strategies based on different event-based sampling techniques are described, compared, and contrasted with a classical time-based control approach and a hybrid one. The common denominator in the time, the hybrid, and the event-based control approaches is the controller: a proportional-integral algorithm with adaptations depending on the selected control approach. To compare and contrast each one of the hybrid and the pure event-based control algorithms with the time-based counterpart, the two tasks that a control strategy must achieve (set-point following and disturbance rejection) are independently analyzed. The experimental study provides new proof concerning the ability of event-based control strategies to minimize the data exchange among the control agents (sensors, controllers, actuators) when an error-free control of the process is not a hard requirement. PMID:22399975
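
    A minimal sketch of one member of this family of event-based sampling techniques, a send-on-delta scheme: the level sensor transmits, and the PI controller recomputes, only when the measurement has moved by more than a dead-band since the last transmission. The tank model, gains, and threshold are illustrative assumptions, not the paper's experimental setup.

```python
# Sketch: send-on-delta event-based PI level control of a simple tank model.
# The controller only updates when the measured level has changed by more
# than DELTA since the last transmitted sample; otherwise the last command holds.
DT, DELTA = 0.1, 0.02          # s and m: hypothetical sampling period and dead-band
KP, KI = 2.0, 0.4              # hypothetical PI gains
AREA, OUT_COEFF = 1.0, 0.05    # tank cross-section (m^2) and outflow coefficient

level, setpoint = 0.0, 1.0
integral, u = 0.0, 0.0
last_sent, events = None, 0

for _ in range(1000):
    # event condition: first sample, or the level moved more than the dead-band
    if last_sent is None or abs(level - last_sent) > DELTA:
        error = setpoint - level
        integral += error * DT
        u = max(0.0, KP * error + KI * integral)   # inflow command, held between events
        last_sent = level
        events += 1
    # simple tank dynamics: d(level)/dt = (inflow - outflow) / area
    level += DT * (u - OUT_COEFF * level) / AREA

print(f"final level {level:.3f} m after {events} controller updates in 1000 steps")
```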

  18. A two-stage cluster sampling method using gridded population data, a GIS, and Google Earth(TM) imagery in a population-based mortality survey in Iraq.

    PubMed

    Galway, LP; Bell, Nathaniel; Al Shatari, Sae; Hagopian, Amy; Burnham, Gilbert; Flaxman, Abraham; Weiss, William M; Rajaratnam, Julie; Takaro, Tim K

    2012-04-27

    Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, posing a challenge of estimating mortality using retrospective population-based surveys. We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method utilizes gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage and Google Earth™ imagery and sampling grids to select households in the second sampling stage. The sampling method was implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Sampling is a challenge in retrospective population-based mortality studies, and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context-specific challenges of the study setting. This sampling strategy, or variations on it, is adaptable and should be considered and tested in other conflict settings.

  19. A two-stage cluster sampling method using gridded population data, a GIS, and Google EarthTM imagery in a population-based mortality survey in Iraq

    PubMed Central

    2012-01-01

    Background: Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, posing a challenge of estimating mortality using retrospective population-based surveys. Results: We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method utilizes gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage and Google Earth™ imagery and sampling grids to select households in the second sampling stage. The sampling method was implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Conclusion: Sampling is a challenge in retrospective population-based mortality studies, and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context-specific challenges of the study setting. This sampling strategy, or variations on it, is adaptable and should be considered and tested in other conflict settings. PMID:22540266

  20. Mendelian breeding units versus standard sampling strategies: Mitochondrial DNA variation in southwest Sardinia

    PubMed Central

    Sanna, Daria; Pala, Maria; Cossu, Piero; Dedola, Gian Luca; Melis, Sonia; Fresu, Giovanni; Morelli, Laura; Obinu, Domenica; Tonolo, Giancarlo; Secchi, Giannina; Triunfo, Riccardo; Lorenz, Joseph G.; Scheinfeldt, Laura; Torroni, Antonio; Robledo, Renato; Francalacci, Paolo

    2011-01-01

    We report a sampling strategy based on Mendelian Breeding Units (MBUs), representing an interbreeding group of individuals sharing a common gene pool. The identification of MBUs is crucial for case-control experimental design in association studies. The aim of this work was to evaluate the possible existence of bias in terms of genetic variability and haplogroup frequencies in the MBU sample, due to severe sample selection. In order to reach this goal, the MBU sampling strategy was compared to a standard selection of individuals according to their surname and place of birth. We analysed mitochondrial DNA variation (first hypervariable segment and coding region) in unrelated healthy subjects from two different areas of Sardinia: the area around the town of Cabras and the western Campidano area. No statistically significant differences were observed when the two sampling methods were compared, indicating that the stringent sample selection needed to establish a MBU does not alter original genetic variability and haplogroup distribution. Therefore, the MBU sampling strategy can be considered a useful tool in association studies of complex traits. PMID:21734814

  1. Comparison of sampling strategies for object-based classification of urban vegetation from Very High Resolution satellite images

    NASA Astrophysics Data System (ADS)

    Rougier, Simon; Puissant, Anne; Stumpf, André; Lachiche, Nicolas

    2016-09-01

    Vegetation monitoring is becoming a major issue in the urban environment because of the services vegetation provides, and it requires accurate and up-to-date mapping. Very High Resolution satellite images enable detailed mapping of urban tree and herbaceous vegetation. Several supervised classifications with statistical learning techniques have provided good results for the detection of urban vegetation but require a large amount of training data. In this context, this study investigates the performance of different sampling strategies in order to reduce the number of examples needed. Two window-based active learning algorithms from the state of the art are compared to a classical stratified random sampling, and a third strategy combining active learning and stratified sampling is proposed. The efficiency of these strategies is evaluated on two medium-sized French cities, Strasbourg and Rennes, associated with different datasets. Results demonstrate that classical stratified random sampling can in some cases be just as effective as active learning methods and that it should be used more frequently to evaluate new active learning methods. Moreover, the active learning strategies proposed in this work reduce the computational runtime by selecting multiple windows at each iteration without increasing the number of windows needed.
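    The sketch below conveys the flavour of uncertainty-driven (margin-based) active learning on synthetic window features; it does not reproduce the paper's algorithms, datasets or the combined active/stratified variant.

```python
# Illustrative margin-based active learning loop on synthetic "window" features;
# the data, batch size and classifier are assumptions, not the study's setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=3000, n_features=12, n_informative=6,
                           n_classes=3, random_state=0)
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

rng = np.random.default_rng(0)
labelled = list(rng.choice(len(X_pool), 30, replace=False))   # initial seed windows

for _ in range(10):                       # query 20 windows per iteration
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_pool[labelled], y_pool[labelled])
    proba = np.sort(clf.predict_proba(X_pool), axis=1)
    margin = proba[:, -1] - proba[:, -2]  # small margin = uncertain window
    margin[labelled] = np.inf             # never re-query labelled windows
    labelled.extend(np.argsort(margin)[:20].tolist())

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_pool[labelled], y_pool[labelled])
print(f"accuracy with {len(labelled)} labelled windows: {clf.score(X_test, y_test):.3f}")
```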

  2. Strategies for de-identification and anonymization of electronic health record data for use in multicenter research studies.

    PubMed

    Kushida, Clete A; Nichols, Deborah A; Jadrnicek, Rik; Miller, Ric; Walsh, James K; Griffin, Kara

    2012-07-01

    De-identification and anonymization are strategies that are used to remove patient identifiers in electronic health record data. The use of these strategies in multicenter research studies is of paramount importance, given the need to share electronic health record data across multiple environments and institutions while safeguarding patient privacy. A systematic literature search was conducted using the keywords de-identify, deidentify, de-identification, deidentification, anonymize, anonymization, data scrubbing, and text scrubbing. The search covered publications up to June 30, 2011 and involved 6 common literature databases. A total of 1798 prospective citations were identified, of which 94 full-text articles met the criteria for review and were obtained. Search results were supplemented by review of 26 additional full-text articles; a total of 120 full-text articles were reviewed. A final sample of 45 articles met the inclusion criteria for review and discussion. Articles were grouped into text, image, and biological-sample categories. For text-based strategies, the approaches were segregated into heuristic, lexical, and pattern-based systems versus statistical learning-based systems. For images, approaches that de-identified photographic facial images and magnetic resonance image data were described. For biological samples, approaches that managed the identifiers linked with these samples were discussed, particularly with respect to meeting the anonymization requirements needed for Institutional Review Board exemption under the Common Rule. Current de-identification strategies have their limitations, and statistical learning-based systems have distinct advantages over other approaches for the de-identification of free text. True anonymization is challenging, and further work is needed in the areas of de-identification of datasets and protection of genetic information.

  3. Recruitment strategies in a prospective longitudinal family study on parents with obesity and their toddlers.

    PubMed

    Bergmann, Sarah; Keitel-Korndörfer, Anja; Herfurth-Majstorovic, Katharina; Wendt, Verena; Klein, Annette M; von Klitzing, Kai; Grube, Matthias

    2017-02-01

    Recruitment of participants with obesity is a real challenge. To reduce time and costs in similar projects, we investigated various recruiting strategies used in a longitudinal family study with respect to their enrolment yield and cost effectiveness. Results may help other research groups to optimize their recruitment strategies. We applied different recruitment strategies to acquire families with children aged 6 to 47 months and at least one parent with obesity (risk group) or two parents of normal weight (control group) for a longitudinal non-interventional study. Based on four main strategies (media, kindergartens, health professionals, and a community focus), we examined 15 different subcategories of strategies. Based on enrolment yield and relative costs (e.g., material expenses, staff time), we analyzed the effectiveness of each recruitment strategy. Following the different recruitment approaches, 685 families contacted us; 26% (n = 178) of these met the inclusion criteria. Of the four main strategies, the community-focused strategy was the most successful (accounting for 36.5% of the sample), followed by contacts with kindergartens (accounting for 28.1% of the sample). Of the subcategories, two strategies were outstanding: posters (a community-focused strategy) and recruitment via kindergartens using phone contacts rather than emailing. Only a small number of participants were recruited via announcements in newspapers (a lower cost strategy), advertisements on public transport or face-to-face recruitment at various places (higher cost strategies). Results revealed that only a combination of different active and passive methods and approaches led to a sufficient sample size. In this study, recruitment via posters and contacting kindergartens by phone produced the highest numbers of participants (high enrolment yield) at moderate costs.

  4. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    Treesearch

    Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes

    2017-01-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...

  5. Sampling designs for contaminant temporal trend analyses using sedentary species exemplified by the snails Bellamya aeruginosa and Viviparus viviparus.

    PubMed

    Yin, Ge; Danielsson, Sara; Dahlberg, Anna-Karin; Zhou, Yihui; Qiu, Yanling; Nyberg, Elisabeth; Bignert, Anders

    2017-10-01

    Environmental monitoring typically assumes samples and sampling activities to be representative of the population being studied. Given a limited budget, an appropriate sampling strategy is essential to support the detection of temporal trends of contaminants. In the present study, based on real chemical analysis data on polybrominated diphenyl ethers in snails collected from five subsites in Tianmu Lake, computer simulation was performed to evaluate three sampling strategies by estimating the sample size required to detect an annual change of 5% with a statistical power of 80% and 90% at a significance level of 5%. The results showed that sampling from an arbitrarily selected sampling spot is the worst strategy, requiring many more individual analyses to achieve the above-mentioned criteria compared with the other two approaches. A fixed sampling site requires the lowest sample size but may not be representative of the intended study object, e.g. a lake, and is also sensitive to changes at that particular sampling site. In contrast, sampling at multiple sites along the shore each year, and using pooled samples when the cost to collect and prepare individual specimens is much lower than the cost for chemical analysis, would be the most robust and cost-efficient strategy in the long run. Using statistical power as the criterion, the results demonstrated quantitatively the consequences of various sampling strategies, and could guide users with respect to the sample sizes required, depending on sampling design, for long-term monitoring programs. Copyright © 2017 Elsevier Ltd. All rights reserved.
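    The simulation below shows, in miniature, how such a power-based sample-size evaluation can be set up: simulate log-scale concentrations declining 5% per year, fit a trend, and count how often it is detected at the 5% significance level. The noise level and the simple regression test are placeholders, not the study's measured values or methods.

```python
# Hedged sketch of a power simulation for trend detection; the noise level (cv)
# and the linear-regression test are assumptions, not the study's parameters.
import numpy as np
from scipy import stats

def detection_power(n_per_year, years=10, annual_change=-0.05, cv=0.4,
                    alpha=0.05, n_sim=2000, seed=1):
    rng = np.random.default_rng(seed)
    t = np.repeat(np.arange(years), n_per_year)
    detected = 0
    for _ in range(n_sim):
        true_log = np.log(100.0) + np.log(1 + annual_change) * t
        obs = true_log + rng.normal(0, cv, size=t.size)   # log-scale measurement noise
        slope, _, _, p, _ = stats.linregress(t, obs)
        detected += (p < alpha) and (slope < 0)
    return detected / n_sim

for n in (2, 5, 10, 20):
    print(f"{n:2d} samples/year -> power to detect 5%/yr decline: {detection_power(n):.2f}")
```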

  6. Examination of instructional strategies: Secondary science teachers of mainstreamed English language learners in two high schools in southern New England

    NASA Astrophysics Data System (ADS)

    Yangambi, Matthieu Wakalewae

    2005-12-01

    Increasingly, English Language Learners (ELLs) are mainstreamed in science classes. As a result, science teachers must assume responsibility for these students' education. Currently, state tests show a wide performance gap between ELLs and non-ELLs in science and other content area courses. For instance, the Massachusetts Comprehensive Assessment System (MCAS) shows a two-year average performance of 6% for ELLs and 33% for non-ELLs in English Language Arts (ELA), Mathematics, and Science and Technology, a 27% performance gap (Lachat, 2000). The use of research-based effective teaching strategies for ELLs is indispensable in order to meet ELLs' learning needs (Jarret, 1999). The purpose of this study was to determine whether differences exist between ELLs and non-ELLs regarding the instructional strategies that secondary science teachers employ. Four areas were examined: the instructional strategies mainstreamed ELLs and non-ELLs report as being most frequently employed by their science teachers, the instructional strategies ELLs and non-ELLs consider most effective in their learning, the existing differences between ELLs and non-ELLs in rating the effectiveness of instructional strategies their teachers currently practice, and factors impacting ELLs' and non-ELLs' performance on high-stakes tests. This study was conducted in two urban high schools in Southern New England. The sample (N = 71) was drawn from students registered in science classes using the non-probability technique known as convenience sampling. The questionnaire was designed based on research-based effective teaching strategies (Burnette, 1999; Ortiz, 1997), using a Likert-type scale. Several findings were of importance. First, ELLs and non-ELLs reported similar frequency of use of effective instructional strategies by teachers. However, ELLs and non-ELLs identified different preferences for strategies. Whereas non-ELLs preferred connecting learning to real-life situations, ELLs rated that strategy as least effective. The results of this study may inform education policy makers and school systems about instructional strategies to implement in classrooms in order to meet the learning needs of every student. Recommendations for practice are included.

  7. Visual and portable strategy for copper(II) detection based on a striplike poly(thymine)-caged and microwell-printed hydrogel.

    PubMed

    Qing, Zhihe; Mao, Zhengui; Qing, Taiping; He, Xiaoxiao; Zou, Zhen; He, Dinggeng; Shi, Hui; Huang, Jin; Liu, Jianbo; Wang, Kemin

    2014-11-18

    Given the importance of developing strategies for copper(II) (Cu(2+)) detection, we report here a visual and portable strategy for Cu(2+) detection based on designing and using a strip-like hydrogel. The hydrogel is functionalized by caging poly(thymine) as probes, which can effectively template the formation of fluorescent copper nanoparticles (CuNPs) in the presence of the reductant (ascorbate) and Cu(2+). On the hydrogel's surface, uniform wells of microliter volume (microwells) are printed for sample injection. When the injected sample is stained by Cu(2+), fluorescent CuNPs are templated in situ by the poly(thymine) in the hydrogel. Under ultraviolet (UV) irradiation, the red fluorescence of the CuNPs can be observed by the naked eye and recorded by a common camera without complicated instruments. Thus, the strategy integrates sample injection, reaction and indication with fast signal response, providing an add-and-read manner for visual and portable detection of Cu(2+) in a strip-like format. A detectable minimum concentration of 20 μM and practically applicable properties, such as resistance to environmental interference and good constancy, have been demonstrated, indicating that the strategy holds great potential and significance for popular detection of Cu(2+), especially in remote regions. We believe that the strip-like hydrogel-based methodology is also applicable to other targets by virtue of altering the probes.

  8. Using pilot data to size a two-arm randomized trial to find a nearly optimal personalized treatment strategy.

    PubMed

    Laber, Eric B; Zhao, Ying-Qi; Regh, Todd; Davidian, Marie; Tsiatis, Anastasios; Stanford, Joseph B; Zeng, Donglin; Song, Rui; Kosorok, Michael R

    2016-04-15

    A personalized treatment strategy formalizes evidence-based treatment selection by mapping patient information to a recommended treatment. Personalized treatment strategies can produce better patient outcomes while reducing cost and treatment burden. Thus, among clinical and intervention scientists, there is a growing interest in conducting randomized clinical trials when one of the primary aims is estimation of a personalized treatment strategy. However, at present, there are no appropriate sample size formulae to assist in the design of such a trial. Furthermore, because the sampling distribution of the estimated outcome under an estimated optimal treatment strategy can be highly sensitive to small perturbations in the underlying generative model, sample size calculations based on standard (uncorrected) asymptotic approximations or computer simulations may not be reliable. We offer a simple and robust method for powering a single stage, two-armed randomized clinical trial when the primary aim is estimating the optimal single stage personalized treatment strategy. The proposed method is based on inverting a plugin projection confidence interval and is thereby regular and robust to small perturbations of the underlying generative model. The proposed method requires elicitation of two clinically meaningful parameters from clinical scientists and uses data from a small pilot study to estimate nuisance parameters, which are not easily elicited. The method performs well in simulated experiments and is illustrated using data from a pilot study of time to conception and fertility awareness. Copyright © 2015 John Wiley & Sons, Ltd.

  9. A comparison of temporal and location-based sampling strategies for global positioning system-triggered electronic diaries.

    PubMed

    Törnros, Tobias; Dorn, Helen; Reichert, Markus; Ebner-Priemer, Ulrich; Salize, Hans-Joachim; Tost, Heike; Meyer-Lindenberg, Andreas; Zipf, Alexander

    2016-11-21

    Self-reporting is a well-established approach within the medical and psychological sciences. In order to avoid recall bias, i.e. past events being remembered inaccurately, the reports can be filled out on a smartphone in real-time and in the natural environment. This is often referred to as ambulatory assessment and the reports are usually triggered at regular time intervals. With this sampling scheme, however, rare events (e.g. a visit to a park or recreation area) are likely to be missed. When addressing the correlation between mood and the environment, it may therefore be beneficial to include participant locations within the ambulatory assessment sampling scheme. Based on the geographical coordinates, the database query system then decides if a self-report should be triggered or not. We simulated four different ambulatory assessment sampling schemes based on movement data (coordinates by minute) from 143 voluntary participants tracked for seven consecutive days. Two location-based sampling schemes incorporating the environmental characteristics (land use and population density) at each participant's location were introduced and compared to a time-based sampling scheme triggering a report on the hour as well as to a sampling scheme incorporating physical activity. We show that location-based sampling schemes trigger a report less often, but we obtain more unique trigger positions and a greater spatial spread in comparison to sampling strategies based on time and distance. Additionally, the location-based methods trigger significantly more often at rarely visited types of land use and less often outside the study region where no underlying environmental data are available.
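    A minimal sketch of the kind of location-based trigger rule described, with a stub land-use lookup and an invented minimum gap between prompts; neither the categories nor the thresholds come from the study.

```python
# Illustrative location-based e-diary trigger; the land-use lookup and the
# thresholds are assumptions for this sketch, not the study's implementation.
from datetime import datetime, timedelta

RARE_LAND_USE = {"park", "forest", "water"}     # rarely visited categories (assumed)
MIN_GAP = timedelta(minutes=30)                 # avoid prompting too frequently

def should_trigger(lat, lon, now, last_trigger, land_use_at):
    """Prompt when the participant is at a rare land-use type and the previous
    prompt was at least MIN_GAP ago."""
    if last_trigger is not None and now - last_trigger < MIN_GAP:
        return False
    return land_use_at(lat, lon) in RARE_LAND_USE

def demo_lookup(lat, lon):
    """Stub land-use lookup used only for this example."""
    return "park" if lat > 49.40 else "residential"

print(should_trigger(49.41, 8.69, datetime(2016, 5, 1, 14, 0), None, demo_lookup))
print(should_trigger(49.38, 8.68, datetime(2016, 5, 1, 14, 5),
                     datetime(2016, 5, 1, 14, 0), demo_lookup))
```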

  10. Optimal sampling strategies for detecting zoonotic disease epidemics.

    PubMed

    Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W

    2014-06-01

    The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  11. Learning Efficiency of Two ICT-Based Instructional Strategies in Greek Sheep Farmers

    ERIC Educational Resources Information Center

    Bellos, Georgios; Mikropoulos, Tassos A.; Deligeorgis, Stylianos; Kominakis, Antonis

    2016-01-01

    Purpose: The objective of the present study was to compare the learning efficiency of two information and communications technology (ICT)-based instructional strategies (multimedia presentation (MP) and concept mapping) in a sample (n = 187) of Greek sheep farmers operating mainly in Western Greece. Design/methodology/approach: In total, 15…

  12. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hampton, Jerrad; Doostan, Alireza, E-mail: alireza.doostan@colorado.edu

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ1-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.
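    A toy one-dimensional illustration of the underlying recovery problem: a sparse set of coefficients in an orthonormal (probabilists') Hermite basis is identified from a small number of Gaussian Monte Carlo samples with an ℓ1 penalty. The LASSO here stands in for the ℓ1-minimization problem; the coherence-optimal MCMC sampler itself is not reproduced.

```python
# Toy sketch: sparse Hermite PC coefficient recovery via an l1-penalized fit.
# The 1-D example and the penalty value are illustrative assumptions.
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from scipy.special import factorial
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
order, n_samples = 12, 60

c_true = np.zeros(order + 1)            # "true" sparse expansion coefficients
c_true[[0, 2, 5]] = [1.0, 0.8, -0.5]

def design(x):
    """Measurement matrix of orthonormal probabilists' Hermite polynomials."""
    V = hermevander(x, order)                              # He_0 .. He_order at x
    return V / np.sqrt(factorial(np.arange(order + 1)))    # normalise columns

x = rng.standard_normal(n_samples)      # natural (Gaussian) sampling of the input
y = design(x) @ c_true + 0.01 * rng.standard_normal(n_samples)

fit = Lasso(alpha=5e-3, fit_intercept=False, max_iter=50_000).fit(design(x), y)
print("recovered non-zero indices:", np.flatnonzero(np.abs(fit.coef_) > 1e-2))
print("coefficients:", np.round(fit.coef_[np.abs(fit.coef_) > 1e-2], 3))
```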

  13. Quantifying Antimicrobial Resistance at Veal Calf Farms

    PubMed Central

    Bosman, Angela B.; Wagenaar, Jaap; Stegeman, Arjan; Vernooij, Hans; Mevius, Dik

    2012-01-01

    This study was performed to determine a sampling strategy to quantify the prevalence of antimicrobial resistance on veal calf farms, based on the variation in antimicrobial resistance within and between calves on five farms. Faecal samples from 50 healthy calves (10 calves/farm) were collected. From each individual sample and one pooled faecal sample per farm, 90 selected Escherichia coli isolates were tested for their resistance against 25 mg/L amoxicillin, 25 mg/L tetracycline, 0.5 mg/L cefotaxime, 0.125 mg/L ciprofloxacin and 8/152 mg/L trimethoprim/sulfamethoxazole (tmp/s) by replica plating. From each faecal sample, another 10 selected E. coli isolates were tested for their resistance by broth microdilution as a reference. Logistic regression analysis was performed to compare the odds of an isolate testing resistant between the two test methods (replica plating vs. broth microdilution) and to evaluate the effect of pooling faecal samples. Bootstrap analysis was used to investigate the precision of the estimated prevalence of resistance to each antimicrobial obtained by several simulated sampling strategies. Replica plating showed similar odds of an E. coli isolate testing resistant compared to broth microdilution, except for ciprofloxacin (OR 0.29, p≤0.05). Pooled samples showed, in general, lower odds of an isolate being resistant compared to individual samples, although these differences were not significant. Bootstrap analysis showed that, within each antimicrobial, the various compositions of a pooled sample provided consistent estimates of the mean proportion of resistant isolates. Sampling strategies should be based on the variation in resistance among isolates within faecal samples and between faecal samples, which may vary by antimicrobial. In our study, the optimal sampling strategy, from the perspective of both precision of the estimated levels of resistance and practicality, consists of a pooled faecal sample from 20 individual animals, of which 90 isolates are tested for their susceptibility by replica plating. PMID:22970313
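    The bootstrap step lends itself to a small sketch: resample isolates from a pooled-sample result to gauge the precision of the estimated resistance proportion. The counts below are simulated, not the study's data.

```python
# Illustrative bootstrap of the precision of a resistance-prevalence estimate
# from one pooled sample; the per-isolate outcomes are simulated.
import numpy as np

rng = np.random.default_rng(3)
isolates = rng.binomial(1, 0.35, size=90)   # 90 isolates, ~35% resistant (assumed)

boot = [rng.choice(isolates, size=isolates.size, replace=True).mean()
        for _ in range(5000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"estimated prevalence = {isolates.mean():.2f} (95% bootstrap CI {lo:.2f}-{hi:.2f})")
```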

  14. Solid-Phase Extraction Strategies to Surmount Body Fluid Sample Complexity in High-Throughput Mass Spectrometry-Based Proteomics

    PubMed Central

    Bladergroen, Marco R.; van der Burgt, Yuri E. M.

    2015-01-01

    For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics, automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply to both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE), including affinity enrichment strategies, have been automated. The obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI) without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071

  15. SPARCLE: Space Plasma Alleviation of Regolith Concentrations in the Lunar Environment

    NASA Astrophysics Data System (ADS)

    Clark, P. E.; Keller, J. W.; Curtis, S. A.; Nuth, J. A.; Stubbs, T. J.; Farrell, W. M.

    2006-05-01

    The return of robotic devices and humans to the Moon will occur in the near future. Based on our previous experience, surface dust is a major problem requiring a solution. During the Apollo landings, extensive locally induced stirring of the regolith caused dust to be suspended long enough to come into contact with conducting surfaces. Dust behaved like abrasive Velcro: it adhered to everything, and attempts to remove it by simply brushing did not remove fines (<10) and resulted in severe abrasion. Lunar fines, because of their electrostatic charging, were relatively difficult to collect in sample bags along with particles of other size ranges. Within hours, seals were broken, samples were contaminated, and portions of the samples, especially fines, were lost. Because of this difficulty, details on lunar dust are relatively sparse. Clearly, the strategies initially implemented to deal with lunar dust failed, and developing a dust mitigation strategy will be a major technological challenge. A currently proposed strategy based on increased magnetic susceptibility in lunar fines may not work uniformly well for fines of non-mare, or non-lunar, composition. Based on dust behavior already observed on previous missions, we believe a successful strategy will have to deal with dust dynamics resulting from the interaction between mechanical and electrostatic forces. We plan to test and develop an electrostatically based device that modulates the electrical potential of conducting surfaces, and hence self-cleans exposed surfaces while collecting dust samples. It would constantly scan a surface to control its potential, together with a plate of the opposite potential. As a first step, an experimental low-mass, low-power and low-volume device with complementary electron and ion guns and specially designed self-cleaning nozzles is being designed to test our concept and to develop a working charging and discharging strategy for the lunar environment. Meanwhile, a laboratory simulation will act as a feasibility study for a laboratory breadboard self-cleaning device based on the use of combined electron or ion beams. The compact device would act as a plasma dust sweeper.

  16. Technical bases and guidance for the use of composite soil sampling for demonstrating compliance with radiological release criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vitkus, Timothy J.

    2012-04-24

    This guidance provides information on methodologies and the technical bases that licensees should consider for incorporating composite sampling strategies into final status survey (FSS) plans. It also covers appropriate uses of composite sampling for generating the data for other decommissioning site investigations, such as characterization or other preliminary site investigations.

  17. Data Centric Sensor Stream Reduction for Real-Time Applications in Wireless Sensor Networks

    PubMed Central

    Aquino, Andre Luiz Lins; Nakamura, Eduardo Freire

    2009-01-01

    This work presents a data-centric strategy to meet deadlines in soft real-time applications in wireless sensor networks. This strategy considers three main aspects: (i) the design of the real-time application to obtain the minimum deadlines; (ii) an analytic model to estimate the ideal sample size used by data-reduction algorithms; and (iii) two data-centric stream-based sampling algorithms to perform data reduction whenever necessary. Simulation results show that our data-centric strategies meet deadlines without losing data representativeness. PMID:22303145
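    As a generic stand-in for the data-reduction step (not the paper's specific algorithms), the sketch below keeps a fixed-size uniform sample of a sensor stream with reservoir sampling, one simple way to bound the data volume when a deadline budget caps how much can be transmitted.

```python
# Sketch of stream reduction via reservoir sampling: keep a uniform random
# sample of k readings from a stream of unknown length (generic illustration).
import random

def reservoir_sample(stream, k, rng=random.Random(7)):
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)
        else:
            j = rng.randint(0, i)        # keep item with probability k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

readings = (20.0 + 0.01 * i for i in range(10_000))   # simulated sensor stream
print(reservoir_sample(readings, k=8))
```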

  18. Effect of different bleaching strategies on microhardness of a silorane-based composite resin.

    PubMed

    Bahari, Mahmoud; Savadi Oskoee, Siavash; Mohammadi, Narmin; Ebrahimi Chaharom, Mohammad Esmaeel; Godrati, Mostafa; Savadi Oskoee, Ayda

    2016-01-01

    Background. Dentists' awareness of the effects of bleaching agents on the surface and mechanical properties of restorative materials is of utmost importance. Therefore, this in vitro study was undertaken to investigate the effects of different bleaching strategies on the microhardness of a silorane-based composite resin. Methods. Eighty samples of a silorane-based composite resin (4 mm in diameter and 2 mm in thickness) were prepared within acrylic molds. The samples were polished and randomly assigned to 4 groups (n=20). Group 1 (controls) was stored in distilled water for 2 weeks. The samples in group 2 underwent a bleaching procedure with 15% carbamide peroxide for two weeks, two hours daily. The samples in group 3 were bleached with 35% hydrogen peroxide twice, 5 days apart, for 30 minutes each time. The samples in group 4 underwent a bleaching procedure with light-activated 35% hydrogen peroxide under LED light once for 40 minutes. The microhardness of the samples was then determined using the Vickers method. Data were analyzed with one-way ANOVA and post hoc Tukey tests (P < 0.05). Results. All the bleaching agents significantly decreased microhardness compared to the control group (P < 0.05). In addition, there were significant differences in microhardness between groups 2 and 4 (P = 0.001) and between groups 3 and 4 (P < 0.001). However, no significant differences were detected in microhardness between groups 2 and 3 (P > 0.05). Conclusion. Bleaching agents decreased the microhardness of silorane-based composite resin restorations, with the magnitude of the decrease depending on the bleaching strategy used.

  19. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    PubMed

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

    Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality. Copyright © 2016 Elsevier B.V. All rights reserved.
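    Step (iii) is simple enough to sketch in isolation. Below, probabilistic quotient normalization (PQN) is applied to a synthetic features-by-samples intensity matrix with known simulated dilution factors; the full sequential strategy (pre-acquisition dilution, QC-RLSC drift correction) is not reproduced.

```python
# Minimal PQN sketch on a synthetic metabolite-by-sample intensity matrix.
import numpy as np

rng = np.random.default_rng(0)
true = rng.lognormal(mean=2.0, sigma=0.5, size=(200, 1))        # 200 features
dilution = rng.uniform(0.5, 2.0, size=30)                       # 30 urine samples
X = true * dilution * rng.lognormal(0, 0.1, size=(200, 30))     # observed intensities

reference = np.median(X, axis=1, keepdims=True)   # reference spectrum (median sample)
quotients = X / reference                         # feature-wise quotients per sample
dilution_factor = np.median(quotients, axis=0)    # most probable quotient per sample
X_pqn = X / dilution_factor                       # concentration-normalised matrix

print("correlation of estimated factors with simulated dilution:",
      round(float(np.corrcoef(dilution_factor, dilution)[0, 1]), 3))
```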

  20. An efficient adaptive sampling strategy for global surrogate modeling with applications in multiphase flow simulation

    NASA Astrophysics Data System (ADS)

    Mo, S.; Lu, D.; Shi, X.; Zhang, G.; Ye, M.; Wu, J.

    2016-12-01

    Surrogate models have shown remarkable computational efficiency in hydrological simulations involving design space exploration, sensitivity analysis, uncertainty quantification, etc. The central task in constructing a global surrogate model is to achieve a prescribed approximation accuracy with as few original model executions as possible, which requires a good design strategy to optimize the distribution of data points in the parameter domains and an effective stopping criterion to automatically terminate the design process when the desired approximation accuracy is achieved. This study proposes a novel adaptive sampling strategy, which starts from a small number of initial samples and adaptively selects additional samples by balancing collection in unexplored regions and refinement in interesting areas. We define an efficient and effective evaluation metric based on a Taylor expansion to select the most promising potential samples from candidate points, and propose a robust stopping criterion based on the approximation accuracy at new points to guarantee the achievement of the desired accuracy. The numerical results of several benchmark analytical functions indicate that the proposed approach is more computationally efficient and robust than the widely used maximin distance design and two other well-known adaptive sampling strategies. The application to two complicated multiphase flow problems further demonstrates the efficiency and effectiveness of our method in constructing global surrogate models for high-dimensional and highly nonlinear problems. Acknowledgements: This work was financially supported by the National Natural Science Foundation of China grants No. 41030746 and 41172206.
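    The sketch below shows the shape of such an adaptive loop on a one-dimensional toy model: fit a surrogate, score candidate points by a mix of predictive uncertainty (exploration) and local variation of the prediction (refinement), add the best candidate, and stop once the predicted uncertainty is small. The acquisition rule and stopping threshold are simple stand-ins, not the paper's Taylor-expansion metric or criterion.

```python
# Simplified adaptive-sampling loop for a global surrogate on a toy 1-D model;
# the acquisition score and stopping rule are assumptions, not the paper's.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

f = lambda x: np.sin(3 * x) + 0.5 * np.cos(7 * x)       # expensive model stand-in
rng = np.random.default_rng(0)

X = rng.uniform(0, 2, size=(5, 1))                      # small initial design
y = f(X).ravel()
candidates = np.linspace(0, 2, 400).reshape(-1, 1)

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-8)
    gp.fit(X, y)
    mu, std = gp.predict(candidates, return_std=True)
    rough = np.abs(np.gradient(mu, candidates.ravel()))  # crude refinement term
    score = std + 0.05 * rough                           # exploration + refinement
    if std.max() < 0.01:                                 # simple stopping criterion
        break
    x_new = candidates[np.argmax(score)].reshape(1, 1)
    X, y = np.vstack([X, x_new]), np.append(y, f(x_new).ravel())

print(f"stopped after {len(y)} model runs; max predicted std = {std.max():.4f}")
```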

  1. Learning Strategies of Successful and Unsuccessful University Students

    ERIC Educational Resources Information Center

    Simsek, Ali; Balaban, Jale

    2010-01-01

    The purpose of this study was to assess the most commonly used learning strategies of undergraduate students and how these strategies were related to their academic performance. Toward this purpose, a 60 item Likert scale was administered to a sample of 278 undergraduate students. The students were selected based on their cumulative…

  2. Evaluation of different approaches for identifying optimal sites to predict mean hillslope soil moisture content

    NASA Astrophysics Data System (ADS)

    Liao, Kaihua; Zhou, Zhiwen; Lai, Xiaoming; Zhu, Qing; Feng, Huihui

    2017-04-01

    The identification of representative soil moisture sampling sites is important for the validation of remotely sensed mean soil moisture in a certain area and for ground-based soil moisture measurements in catchment or hillslope hydrological studies. Numerous approaches have been developed to identify optimal sites for predicting mean soil moisture. Each method has certain advantages and disadvantages, but they have rarely been evaluated and compared. In our study, surface (0-20 cm) soil moisture data from January 2013 to March 2016 (a total of 43 sampling days) were collected at 77 sampling sites on a mixed land-use (tea and bamboo) hillslope in the hilly area of Taihu Lake Basin, China. A total of 10 methods (temporal stability (TS) analyses based on 2 indices, K-means clustering based on 6 kinds of inputs, and 2 random sampling strategies) were evaluated for determining optimal sampling sites for mean soil moisture estimation. They were TS analyses based on the smallest index of temporal stability (ITS, a combination of the mean relative difference and the standard deviation of relative difference (SDRD)) and based on the smallest SDRD; K-means clustering based on soil properties and terrain indices (EFs), repeated soil moisture measurements (Theta), EFs plus one-time soil moisture data (EFsTheta), and the principal components derived from EFs (EFs-PCA), Theta (Theta-PCA), and EFsTheta (EFsTheta-PCA); and global and stratified random sampling strategies. Results showed that TS based on the smallest ITS was better (RMSE = 0.023 m3 m-3) than that based on the smallest SDRD (RMSE = 0.034 m3 m-3). The K-means clustering based on EFsTheta (-PCA) was better (RMSE <0.020 m3 m-3) than those based on EFs (-PCA) and Theta (-PCA). The sampling design stratified by land use was more efficient than the global random method. Forty and 60 sampling sites are needed for stratified sampling and global sampling, respectively, to make their performances comparable to the best K-means method (EFsTheta-PCA). Overall, TS required only one site, but its accuracy was limited. The best K-means method required <8 sites and yielded high accuracy, but extra soil and terrain information is necessary when using this method. The stratified sampling strategy can only be used if no pre-knowledge about soil moisture variation is available. This information will help in selecting the optimal method for estimating the area mean soil moisture.
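    The temporal-stability ranking can be sketched directly. Below, the mean relative difference (MRD) and its standard deviation (SDRD) are computed for a synthetic sites-by-days moisture matrix, and ITS is taken as sqrt(MRD^2 + SDRD^2), a common formulation assumed here rather than taken from the paper.

```python
# Sketch of the temporal-stability ranking of candidate sites on a synthetic
# sites-by-days soil moisture matrix (the data are not the study's).
import numpy as np

rng = np.random.default_rng(1)
n_sites, n_days = 77, 43
theta = (0.25 + 0.05 * rng.standard_normal((n_sites, n_days))
         + 0.03 * rng.standard_normal((n_sites, 1)))          # persistent site offsets

spatial_mean = theta.mean(axis=0)                 # hillslope mean per sampling day
rel_diff = (theta - spatial_mean) / spatial_mean  # relative difference per site/day

mrd = rel_diff.mean(axis=1)                       # mean relative difference
sdrd = rel_diff.std(axis=1, ddof=1)               # standard deviation of rel. diff.
its = np.sqrt(mrd**2 + sdrd**2)                   # index of temporal stability (assumed form)

best = np.argsort(its)[:3]
print("most representative sites (smallest ITS):", best)
print("their MRD:", np.round(mrd[best], 3), " SDRD:", np.round(sdrd[best], 3))
```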

  3. Females' sampling strategy to comparatively evaluate prospective mates in the peacock blenny Salaria pavo.

    PubMed

    Locatello, Lisa; Rasotto, Maria B

    2017-08-01

    Emerging evidence suggests the occurrence of comparative decision-making processes in mate choice, questioning the traditional idea of female choice based on rules of absolute preference. In such a scenario, females are expected to use a typical best-of-n sampling strategy, being able to recall previously sampled males based on memory of their quality and location. Accordingly, the quality of the preferred mate is expected to be unrelated to both the number and the sequence of female visits. We found support for these predictions in the peacock blenny, Salaria pavo, a fish in which females have the opportunity to evaluate the attractiveness of many males in a short time period and in a restricted spatial range. Indeed, even considering the variability in preference among females, most of them returned to previously sampled males for further evaluations; thus, the preferred male was not the last one in the sequence of visited males. Moreover, there was no relationship between the attractiveness of the preferred male and the number of further visits assigned to the other males. Our results suggest the occurrence of a best-of-n mate sampling strategy in the peacock blenny.

  4. Females' sampling strategy to comparatively evaluate prospective mates in the peacock blenny Salaria pavo

    NASA Astrophysics Data System (ADS)

    Locatello, Lisa; Rasotto, Maria B.

    2017-08-01

    Emerging evidence suggests the occurrence of comparative decision-making processes in mate choice, questioning the traditional idea of female choice based on rules of absolute preference. In such a scenario, females are expected to use a typical best-of-n sampling strategy, being able to recall previously sampled males based on memory of their quality and location. Accordingly, the quality of the preferred mate is expected to be unrelated to both the number and the sequence of female visits. We found support for these predictions in the peacock blenny, Salaria pavo, a fish in which females have the opportunity to evaluate the attractiveness of many males in a short time period and in a restricted spatial range. Indeed, even considering the variability in preference among females, most of them returned to previously sampled males for further evaluations; thus, the preferred male was not the last one in the sequence of visited males. Moreover, there was no relationship between the attractiveness of the preferred male and the number of further visits assigned to the other males. Our results suggest the occurrence of a best-of-n mate sampling strategy in the peacock blenny.

  5. The Students' Perceptions of School Success Promoting Strategies Inventory (SPSI): development and validity evidence based studies.

    PubMed

    Moreira, Paulo A S; Oliveira, João Tiago; Dias, Paulo; Vaz, Filipa Machado; Torres-Oliveira, Isabel

    2014-08-04

    Students' perceptions of school success promotion strategies are of great importance for schools, as they indicate how students experience these strategies. The objective of this study was to develop The Students' Perceptions of School Success Promoting Strategies Inventory (SPSI), which assesses both individual students' perceptions of their school success promoting strategies and dimensions of school quality, and to analyze its validity evidence. A structure of 7 related factors was found, which showed good fit indices in two additional samples, suggesting that this is a well-fitting multi-group model (p < .001). All scales presented good reliability values. Schools with good academic results registered higher values in Career development, Active learning, Proximity, Educational Technologies and Extra-curricular activities (p < .05). The SPSI proved adequate for measuring within-school (students within schools) dimensions of school success. In addition, there is preliminary evidence of its adequacy for measuring school success promotion dimensions between schools for 4 dimensions. This study supports the validity evidence base of the SPSI (validity evidence based on test content, on internal structure, on relations to other variables, and on consequences of testing). Future studies should test for within- and between-level variance in a bigger sample of schools.

  6. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.

  7. Improving riverine constituent concentration and flux estimation by accounting for antecedent discharge conditions

    NASA Astrophysics Data System (ADS)

    Zhang, Qian; Ball, William P.

    2017-04-01

    Regression-based approaches are often employed to estimate riverine constituent concentrations and fluxes based on typically sparse concentration observations. One such approach is the recently developed WRTDS ("Weighted Regressions on Time, Discharge, and Season") method, which has been shown to provide more accurate estimates than prior approaches in a wide range of applications. Centered on WRTDS, this work was aimed at developing improved models for constituent concentration and flux estimation by accounting for antecedent discharge conditions. Twelve modified models were developed and tested, each of which contains one additional flow variable to represent antecedent conditions and which can be directly derived from the daily discharge record. High-resolution (∼daily) data at nine diverse monitoring sites were used to evaluate the relative merits of the models for estimation of six constituents - chloride (Cl), nitrate-plus-nitrite (NOx), total Kjeldahl nitrogen (TKN), total phosphorus (TP), soluble reactive phosphorus (SRP), and suspended sediment (SS). For each site-constituent combination, 30 concentration subsets were generated from the original data through Monte Carlo subsampling and then used to evaluate model performance. For the subsampling, three sampling strategies were adopted: (A) 1 random sample each month (12/year), (B) 12 random monthly samples plus an additional 8 random samples per year (20/year), and (C) flow-stratified sampling with 12 regular (non-storm) and 8 storm samples per year (20/year). Results reveal that estimation performance varies with both model choice and sampling strategy. In terms of model choice, the modified models show general improvement over the original model under all three sampling strategies. Major improvements were achieved for NOx by the long-term flow-anomaly model and for Cl by the ADF (average discounted flow) model and the short-term flow-anomaly model. Moderate improvements were achieved for SS, TP, and TKN by the ADF model. By contrast, no such improvement was achieved for SRP by any proposed model. In terms of sampling strategy, performance of all models (including the original) was generally best using strategy C and worst using strategy A, and especially so for SS, TP, and SRP, confirming the value of routinely collecting stormflow samples. Overall, this work provides a comprehensive set of statistical evidence supporting the incorporation of antecedent discharge conditions into the WRTDS model for estimation of constituent concentration and flux, thereby combining the advantages of two recent developments in water quality modeling.
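    As an illustration of how such an antecedent-condition covariate can be built from the discharge record alone, the sketch below computes a long-term flow anomaly (recent mean log discharge minus long-term mean log discharge) on a synthetic daily series; the window lengths are assumptions, not the paper's definitions.

```python
# Hedged sketch of one antecedent-condition covariate of the kind tested: a
# "flow anomaly" derived entirely from the daily discharge record.
import numpy as np
import pandas as pd

def flow_anomaly(daily_q, short_window=30, long_window=365 * 10):
    """Return log(short-term mean Q) - log(long-term mean Q) for each day."""
    logq = np.log(daily_q)
    short = logq.rolling(short_window, min_periods=short_window).mean()
    long = logq.rolling(long_window, min_periods=365).mean()
    return short - long

# toy daily discharge record: seasonal signal plus positive noise
dates = pd.date_range("2000-01-01", "2015-12-31", freq="D")
q = pd.Series(50 + 30 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
              + np.random.default_rng(0).lognormal(1, 0.5, len(dates)), index=dates)

fa = flow_anomaly(q)     # could be added as an extra regression covariate
print(fa.describe().round(3))
```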

  8. Developmental Changes in the Consideration of Sample Diversity in Inductive Reasoning

    ERIC Educational Resources Information Center

    Rhodes, Marjorie; Gelman, Susan A.; Brickman, Daniel

    2008-01-01

    Determining whether a sample provides a good basis for broader generalizations is a basic challenge of inductive reasoning. Adults apply a diversity-based strategy to this challenge, expecting diverse samples to be a better basis for generalization than homogeneous samples. For example, adults expect that a property shared by two diverse mammals…

  9. Review of sampling hard-to-reach and hidden populations for HIV surveillance.

    PubMed

    Magnani, Robert; Sabin, Keith; Saidel, Tobi; Heckathorn, Douglas

    2005-05-01

    Adequate surveillance of hard-to-reach and 'hidden' subpopulations is crucial to containing the HIV epidemic in low prevalence settings and in slowing the rate of transmission in high prevalence settings. For a variety of reasons, however, conventional facility and survey-based surveillance data collection strategies are ineffective for a number of key subpopulations, particularly those whose behaviors are illegal or illicit. This paper critically reviews alternative sampling strategies for undertaking behavioral or biological surveillance surveys of such groups. Non-probability sampling approaches such as facility-based sentinel surveillance and snowball sampling are the simplest to carry out, but are subject to a high risk of sampling/selection bias. Most of the probability sampling methods considered are limited in that they are adequate only under certain circumstances and for some groups. One relatively new method, respondent-driven sampling, an adaptation of chain-referral sampling, appears to be the most promising for general applications. However, as its applicability to HIV surveillance in resource-poor settings has yet to be established, further field trials are needed before a firm conclusion can be reached.

  10. Improving Riverine Constituent Concentration and Flux Estimation by Accounting for Antecedent Discharge Conditions

    NASA Astrophysics Data System (ADS)

    Zhang, Q.; Ball, W. P.

    2016-12-01

    Regression-based approaches are often employed to estimate riverine constituent concentrations and fluxes based on typically sparse concentration observations. One such approach is the WRTDS ("Weighted Regressions on Time, Discharge, and Season") method, which has been shown to provide more accurate estimates than prior approaches. Centered on WRTDS, this work was aimed at developing improved models for constituent concentration and flux estimation by accounting for antecedent discharge conditions. Twelve modified models were developed and tested, each of which contains one additional variable to represent antecedent conditions. High-resolution (∼daily) data at nine monitoring sites were used to evaluate the relative merits of the models for estimation of six constituents - chloride (Cl), nitrate-plus-nitrite (NOx), total Kjeldahl nitrogen (TKN), total phosphorus (TP), soluble reactive phosphorus (SRP), and suspended sediment (SS). For each site-constituent combination, 30 concentration subsets were generated from the original data through Monte Carlo sub-sampling and then used to evaluate model performance. For the sub-sampling, three sampling strategies were adopted: (A) 1 random sample each month (12/year), (B) 12 random monthly samples plus an additional 8 random samples per year (20/year), and (C) 12 regular (non-storm) and 8 storm samples per year (20/year). The modified models show general improvement over the original model under all three sampling strategies. Major improvements were achieved for NOx by the long-term flow-anomaly model and for Cl by the ADF (average discounted flow) model and the short-term flow-anomaly model. Moderate improvements were achieved for SS, TP, and TKN by the ADF model. By contrast, no such improvement was achieved for SRP by any proposed model. In terms of sampling strategy, performance of all models was generally best using strategy C and worst using strategy A, and especially so for SS, TP, and SRP, confirming the value of routinely collecting storm-flow samples. Overall, this work provides a comprehensive set of statistical evidence supporting the incorporation of antecedent discharge conditions into WRTDS for constituent concentration and flux estimation, thereby combining the advantages of two recent developments in water quality modeling.

  11. Spatial Dependence and Sampling of Phytoseiid Populations on Hass Avocados in Southern California.

    PubMed

    Lara, Jesús R; Amrich, Ruth; Saremi, Naseem T; Hoddle, Mark S

    2016-04-22

    Research on phytoseiid mites has been critical for developing an effective biocontrol strategy for suppressing Oligonychus perseae Tuttle, Baker, and Abatiello (Acari: Tetranychidae) in California avocado orchards. However, basic understanding of the spatial ecology of natural populations of phytoseiids in relation to O. perseae infestations, and the validation of research-based strategies for assessing densities of these predators, have been limited. To address these shortcomings, cross-sectional and longitudinal observations consisting of >3,000 phytoseiids and 500,000 O. perseae counted on 11,341 leaves were collected across 10 avocado orchards during a 10-yr period. Subsets of these data were analyzed statistically to characterize the spatial distribution of phytoseiids in avocado orchards and to evaluate the merits of developing binomial and enumerative sampling strategies for these predators. Spatial correlation of phytoseiids between trees was detected at one site, and a strong association of phytoseiids with elevated O. perseae densities was detected at four sites. Sampling simulations revealed that enumeration-based sampling performed better than binomial sampling for estimating phytoseiid densities. The ecological implications of these findings and the potential for developing a custom sampling plan to estimate densities of phytoseiids inhabiting sampled trees in California avocado orchards are discussed. © The Authors 2016. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Draft evaluation of the frequency for gas sampling for the high burnup confirmatory data project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stockman, Christine T.; Alsaed, Halim A.; Bryan, Charles R.

    2015-03-26

    This report fulfills the M3 milestone M3FT-15SN0802041, “Draft Evaluation of the Frequency for Gas Sampling for the High Burn-up Storage Demonstration Project” under Work Package FT-15SN080204, “ST Field Demonstration Support – SNL”. This report provides a technically based gas sampling frequency strategy for the High Burnup (HBU) Confirmatory Data Project. The evaluation of: 1) the types and magnitudes of gases that could be present in the project cask and 2) the degradation mechanisms that could change gas compositions culminates in an adaptive gas sampling frequency strategy. This adaptive strategy is compared against the sampling frequency that has been developed based on operational considerations. Gas sampling will provide information on the presence of residual water (and byproducts associated with its reactions and decomposition) and breach of cladding, which could inform the decision of when to open the project cask.

  13. Representativeness-based sampling network design for the State of Alaska

    Treesearch

    Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove

    2013-01-01

    Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...

  14. GARN: Sampling RNA 3D Structure Space with Game Theory and Knowledge-Based Scoring Strategies.

    PubMed

    Boudard, Mélanie; Bernauer, Julie; Barth, Dominique; Cohen, Johanne; Denise, Alain

    2015-01-01

    Cellular processes involve large numbers of RNA molecules. The functions of these RNA molecules and their binding to molecular machines are highly dependent on their 3D structures. One of the key challenges in RNA structure prediction and modeling is predicting the spatial arrangement of the various structural elements of RNA. As RNA folding is generally hierarchical, methods involving coarse-grained models hold great promise for this purpose. We present here a novel coarse-grained method for sampling, based on game theory and knowledge-based potentials. This strategy, GARN (Game Algorithm for RNa sampling), is often much faster than previously described techniques and generates large sets of solutions closely resembling the native structure. GARN is thus a suitable starting point for the molecular modeling of large RNAs, particularly those with experimental constraints. GARN is available from: http://garn.lri.fr/.

  15. Comparative effectiveness and acceptability of home-based and clinic-based sampling methods for sexually transmissible infections screening in females aged 14-50 years: a systematic review and meta-analysis.

    PubMed

    Odesanmi, Tolulope Y; Wasti, Sharada P; Odesanmi, Omolola S; Adegbola, Omololu; Oguntuase, Olubukola O; Mahmood, Sajid

    2013-12-01

    Home-based sampling is a strategy to enhance uptake of sexually transmissible infection (STI) screening. This review aimed to compare the screening uptake levels of home-based self-sampling and clinic-based specimen collection for STIs (chlamydia (Chlamydia trachomatis), gonorrhoea (Neisseria gonorrhoeae) and trichomoniasis) in females aged 14-50 years. Acceptability and effect on specimen quality were determined. Sixteen electronic databases were searched from inception to September 2012. Randomised controlled trials (RCTs) comparing the uptake levels of home-based self-sampling and clinic-based sampling for chlamydia, gonorrhoea and trichomoniasis in females aged 14-50 years were eligible for inclusion. The risk of bias in the trials was assessed. Risk ratios (RRs) for dichotomous outcomes were meta-analysed. Of 3065 papers, six studies with seven RCTs contributed to the final review. Compared with clinic-based methods, home-based screening increased uptake significantly (P=0.001-0.05) in five trials and was substantiated in a meta-analysis (RR: 1.55; 95% confidence interval: 1.30-1.85; P=0.00001) of two trials. In three trials, a significant preference for home-based testing (P=0.001-0.05) was expressed. No significant difference was observed in specimen quality. Sampling was rated as easy by a significantly higher number of women (P=0.01) in the clinic group in one trial. The review provides evidence that home-based testing results in greater uptake of STI screening in females (14-50 years) than clinic-based testing without compromising quality in the developed world. Home collection strategies should be added to clinic-based screening programs to enhance uptake.
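
    For readers unfamiliar with how a pooled risk ratio like the one above is typically obtained, the sketch below shows a fixed-effect (inverse-variance) pooling of risk ratios from 2x2 tables. The trial counts are hypothetical placeholders, not data from this review.

```python
# Sketch: fixed-effect (inverse-variance) pooling of risk ratios, the kind of
# calculation behind a meta-analysed uptake estimate. The 2x2 counts below are
# purely hypothetical placeholders, not trial data from the review.
import math

# (events_home, n_home, events_clinic, n_clinic) for two hypothetical trials
trials = [(120, 300, 80, 300),
          (200, 500, 140, 500)]

log_rrs, weights = [], []
for a, n1, c, n2 in trials:
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)   # SE of ln(RR)
    log_rrs.append(math.log(rr))
    weights.append(1 / se ** 2)                        # inverse-variance weight

pooled_log_rr = sum(w * x for w, x in zip(weights, log_rrs)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
lo, hi = (math.exp(pooled_log_rr - 1.96 * pooled_se),
          math.exp(pooled_log_rr + 1.96 * pooled_se))
print(f"pooled RR = {math.exp(pooled_log_rr):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```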

  16. Design and implementation of real-time wireless projection system based on ARM embedded system

    NASA Astrophysics Data System (ADS)

    Long, Zhaohua; Tang, Hao; Huang, Junhua

    2018-04-01

    To address the shortcomings of existing real-time screen-sharing systems, a real-time wireless projection system is proposed in this paper. Based on the proposed system, a weight-based frame-deletion strategy that combines the sampling time period and data variation is proposed. Implementation of the system on a hardware platform shows that it achieves good results: the weight-based strategy improves service quality, reduces delay, and optimizes the real-time customer service system [1].

  17. Highly sensitive dendrimer-based nanoplasmonic biosensor for drug allergy diagnosis.

    PubMed

    Soler, Maria; Mesa-Antunez, Pablo; Estevez, M-Carmen; Ruiz-Sanchez, Antonio Jesus; Otte, Marinus A; Sepulveda, Borja; Collado, Daniel; Mayorga, Cristobalina; Torres, Maria Jose; Perez-Inestrosa, Ezequiel; Lechuga, Laura M

    2015-04-15

    A label-free biosensing strategy for amoxicillin (AX) allergy diagnosis based on the combination of novel dendrimer-based conjugates and a recently developed nanoplasmonic sensor technology is reported. Gold nanodisks were functionalized with a custom-designed thiol-ending-polyamido-based dendron (d-BAPAD) peripherally decorated with amoxicilloyl (AXO) groups (d-BAPAD-AXO) in order to detect specific IgE generated in patient's serum against this antibiotic during an allergy outbreak. This innovative strategy, which follows a simple one-step immobilization procedure, shows exceptional results in terms of sensitivity and robustness, leading to a highly reproducible and long-term stable surface which allows extremely low limits of detection to be achieved. Moreover, the viability of this biosensor approach for analyzing human biological samples has been demonstrated by directly analyzing and quantifying specific anti-AX antibodies in patient's serum without any sample pretreatment. An excellent limit of detection (LoD) of 0.6 ng/mL (i.e., 0.25 kU/L) has been achieved in the evaluation of clinical samples, evidencing the potential of our nanoplasmonic biosensor as an advanced diagnostic tool to quickly identify allergic patients. The results have been compared and validated with a conventional clinical immunofluorescence assay (ImmunoCAP test), confirming an excellent correlation between both techniques. The combination of a novel compact nanoplasmonic platform and a dendrimer-based strategy provides a highly sensitive label-free biosensor approach with more than two times better detectability than conventional SPR. Both the biosensor device and the carrier structure hold great potential in clinical diagnosis for biomarker analysis in whole serum samples and other human biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Methods of developing core collections based on the predicted genotypic value of rice ( Oryza sativa L.).

    PubMed

    Li, C T; Shi, C H; Wu, J G; Xu, H M; Zhang, H Z; Ren, Y L

    2004-04-01

    The selection of an appropriate sampling strategy and a clustering method is important in the construction of core collections based on predicted genotypic values in order to retain the greatest degree of genetic diversity of the initial collection. In this study, methods of developing rice core collections were evaluated based on the predicted genotypic values for 992 rice varieties with 13 quantitative traits. The genotypic values of the traits were predicted by the adjusted unbiased prediction (AUP) method. Based on the predicted genotypic values, Mahalanobis distances were calculated and employed to measure the genetic similarities among the rice varieties. Six hierarchical clustering methods, including the single linkage, median linkage, centroid, unweighted pair-group average, weighted pair-group average and flexible-beta methods, were combined with random, preferred and deviation sampling to develop 18 core collections of rice germplasm. The results show that the deviation sampling strategy in combination with the unweighted pair-group average method of hierarchical clustering retains the greatest degree of genetic diversity of the initial collection. The core collections sampled using predicted genotypic values had more genetic diversity than those based on phenotypic values.

  19. Adapted random sampling patterns for accelerated MRI.

    PubMed

    Knoll, Florian; Clason, Christian; Diwoky, Clemens; Stollberger, Rudolf

    2011-02-01

    Variable density random sampling patterns have recently become increasingly popular for accelerated imaging strategies, as they lead to incoherent aliasing artifacts. However, the design of these sampling patterns is still an open problem. Current strategies use model assumptions like polynomials of different order to generate a probability density function that is then used to generate the sampling pattern. This approach relies on the optimization of design parameters which is very time consuming and therefore impractical for daily clinical use. This work presents a new approach that generates sampling patterns by making use of power spectra of existing reference data sets and hence requires neither parameter tuning nor an a priori mathematical model of the density of sampling points. The approach is validated with downsampling experiments, as well as with accelerated in vivo measurements. The proposed approach is compared with established sampling patterns, and the generalization potential is tested by using a range of reference images. Quantitative evaluation is performed for the downsampling experiments using RMS differences to the original, fully sampled data set. Our results demonstrate that the image quality of the method presented in this paper is comparable to that of an established model-based strategy when optimization of the model parameter is carried out and yields superior results to non-optimized model parameters. However, no random sampling pattern showed superior performance when compared to conventional Cartesian subsampling for the considered reconstruction strategy.
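
    A minimal sketch of how such a reference-driven pattern could be generated is given below: the 2D power spectrum of a (here synthetic) reference data set is normalized into a probability density from which k-space sampling locations are drawn. This illustrates the general idea only and is not the authors' implementation.

```python
# Sketch: deriving a variable-density k-space sampling mask from the power
# spectrum of a reference data set (synthetic here), rather than from a
# polynomial density model. Not the authors' implementation.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 128x128 "reference image" (smooth blob plus a sharp stripe) used as
# a stand-in for a fully sampled reference data set
n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
reference = np.exp(-(x ** 2 + y ** 2) * 8) + 0.3 * (np.abs(x) < 0.2)

# Power spectrum of the reference, normalised into a sampling density
spectrum = np.abs(np.fft.fftshift(np.fft.fft2(reference))) ** 2
density = spectrum / spectrum.sum()

# Draw an undersampling mask for roughly 4x acceleration (25% of k-space)
n_samples = density.size // 4
flat_idx = rng.choice(density.size, size=n_samples, replace=False,
                      p=density.ravel())
mask = np.zeros(density.size, dtype=bool)
mask[flat_idx] = True
mask = mask.reshape(density.shape)

centre = mask[n // 2 - 8:n // 2 + 8, n // 2 - 8:n // 2 + 8]
print(f"sampled {mask.mean():.1%} of k-space; "
      f"central 16x16 coverage {centre.mean():.1%}")
```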

  20. Quantitative Analysis of Tissue Samples by Combining iTRAQ Isobaric Labeling with Selected/Multiple Reaction Monitoring (SRM/MRM).

    PubMed

    Narumi, Ryohei; Tomonaga, Takeshi

    2016-01-01

    Mass spectrometry-based phosphoproteomics is an indispensable technique used in the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as for biological studies. We herein describe the application of a large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.

  1. A Critical Assessment of Bias in Survey Studies Using Location-Based Sampling to Recruit Patrons in Bars

    PubMed Central

    Morrison, Christopher; Lee, Juliet P.; Gruenewald, Paul J.; Marzell, Miesha

    2015-01-01

    Location-based sampling is a method for obtaining samples of people within ecological contexts relevant to specific public health outcomes. Random selection increases generalizability; however, in some circumstances (such as surveying bar patrons) recruitment conditions increase the risk of sample bias. We attempted to recruit representative samples of bars and patrons in six California cities, but low response rates precluded meaningful analysis. A systematic review of 24 similar studies revealed that none addressed the key shortcomings of our study. We recommend steps to improve studies that use location-based sampling: (i) purposively sample places of interest, (ii) utilize recruitment strategies appropriate to the environment, and (iii) provide full information on response rates at all levels of sampling. PMID:26574657

  2. Improvements to robotics-inspired conformational sampling in rosetta.

    PubMed

    Stein, Amelie; Kortemme, Tanja

    2013-01-01

    To accurately predict protein conformations in atomic detail, a computational method must be capable of sampling models sufficiently close to the native structure. All-atom sampling is difficult because of the vast number of possible conformations and extremely rugged energy landscapes. Here, we test three sampling strategies to address these difficulties: conformational diversification, intensification of torsion and omega-angle sampling and parameter annealing. We evaluate these strategies in the context of the robotics-based kinematic closure (KIC) method for local conformational sampling in Rosetta on an established benchmark set of 45 12-residue protein segments without regular secondary structure. We quantify performance as the fraction of sub-Angstrom models generated. While improvements with individual strategies are only modest, the combination of intensification and annealing strategies into a new "next-generation KIC" method yields a four-fold increase over standard KIC in the median percentage of sub-Angstrom models across the dataset. Such improvements enable progress on more difficult problems, as demonstrated on longer segments, several of which could not be accurately remodeled with previous methods. Given its improved sampling capability, next-generation KIC should allow advances in other applications such as local conformational remodeling of multiple segments simultaneously, flexible backbone sequence design, and development of more accurate energy functions.

  3. Improvements to Robotics-Inspired Conformational Sampling in Rosetta

    PubMed Central

    Stein, Amelie; Kortemme, Tanja

    2013-01-01

    To accurately predict protein conformations in atomic detail, a computational method must be capable of sampling models sufficiently close to the native structure. All-atom sampling is difficult because of the vast number of possible conformations and extremely rugged energy landscapes. Here, we test three sampling strategies to address these difficulties: conformational diversification, intensification of torsion and omega-angle sampling and parameter annealing. We evaluate these strategies in the context of the robotics-based kinematic closure (KIC) method for local conformational sampling in Rosetta on an established benchmark set of 45 12-residue protein segments without regular secondary structure. We quantify performance as the fraction of sub-Angstrom models generated. While improvements with individual strategies are only modest, the combination of intensification and annealing strategies into a new “next-generation KIC” method yields a four-fold increase over standard KIC in the median percentage of sub-Angstrom models across the dataset. Such improvements enable progress on more difficult problems, as demonstrated on longer segments, several of which could not be accurately remodeled with previous methods. Given its improved sampling capability, next-generation KIC should allow advances in other applications such as local conformational remodeling of multiple segments simultaneously, flexible backbone sequence design, and development of more accurate energy functions. PMID:23704889

  4. Engineering-Based Problem Solving Strategies in AP Calculus: An Investigation into High School Student Performance on Related Rate Free-Response Problems

    ERIC Educational Resources Information Center

    Thieken, John

    2012-01-01

    A sample of 127 high school Advanced Placement (AP) Calculus students from two schools was utilized to study the effects of an engineering design-based problem solving strategy on student performance with AP style Related Rate questions and changes in conceptions, beliefs, and influences. The research design followed a treatment-control multiple…

  5. The Effect of Using a Proposed Teaching Strategy Based on the Selective Thinking on Students' Acquisition Concepts in Mathematics

    ERIC Educational Resources Information Center

    Qudah, Ahmad Hassan

    2016-01-01

    This study aimed to identify the effect of using a proposed teaching strategy based on selective thinking on the acquisition of mathematical concepts by classroom-teacher students at Al al-Bayt University. The sample of the study consisted of 74 students, equally distributed into a control group and an experimental group. The selective thinking…

  6. Advanced Navigation Strategies For Asteroid Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Getzandanner, K.; Bauman, J.; Williams, B.; Carpenter, J.

    2010-01-01

    Flyby and rendezvous missions to asteroids have been accomplished using navigation techniques derived from experience gained in planetary exploration. This paper presents analysis of advanced navigation techniques required to meet unique challenges for precision navigation to acquire a sample from an asteroid and return it to Earth. These techniques rely on tracking data types such as spacecraft-based laser ranging and optical landmark tracking in addition to the traditional Earth-based Deep Space Network radio metric tracking. A systematic study of navigation strategy, including the navigation event timeline and reduction in spacecraft-asteroid relative errors, has been performed using simulation and covariance analysis on a representative mission.

  7. Surface sampling techniques for 3D object inspection

    NASA Astrophysics Data System (ADS)

    Shih, Chihhsiong S.; Gerhardt, Lester A.

    1995-03-01

    While the uniform sampling method is quite popular for pointwise measurement of manufactured parts, this paper proposes three novel sampling strategies which emphasize 3D non-uniform inspection capability. They are: (a) adaptive sampling, (b) local adjustment sampling, and (c) finite element centroid sampling. The adaptive sampling strategy is based on a recursive surface subdivision process. Two different approaches are described for this adaptive sampling strategy: one uses triangle patches while the other uses rectangle patches. Several real-world objects were tested using these two algorithms. Preliminary results show that sample points are distributed more closely around edges, corners, and vertices, as desired for many classes of objects. Adaptive sampling using triangle patches is shown to generally perform better than both uniform sampling and adaptive sampling using rectangle patches. The local adjustment sampling strategy uses a set of predefined starting points and then finds the locally optimal position of each nodal point. This method approximates the object by moving the points toward object edges and corners. In a hybrid approach, uniform and non-uniform point sets, first preprocessed by the adaptive sampling algorithm on a real-world object, were then tested using the local adjustment sampling method. The results show that the initial point sets, when preprocessed by adaptive sampling using triangle patches, are moved the least distance by the subsequently applied local adjustment method, again showing the superiority of this method. The finite element sampling technique samples the centroids of the surface triangle meshes produced by the finite element method. The performance of this algorithm was compared to that of adaptive sampling using triangular patches. Adaptive sampling with triangular patches was once again shown to be better on different classes of objects.
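
    The toy sketch below illustrates the recursive-subdivision idea behind the adaptive strategy on a simple height-field surface: triangles are split wherever a planarity test fails, so samples concentrate near edges and corners. The surface, the deviation test, and all parameters are assumptions for illustration only, not the paper's algorithm.

```python
# Toy sketch of adaptive sampling by recursive triangle subdivision: triangles
# are split at edge midpoints wherever the surface deviates from a planar
# interpolation by more than a tolerance, so samples concentrate near edges.
# The height-field surface f(x, y) is a made-up example.
import numpy as np

def f(x, y):
    """Example surface with a sharp step (an 'edge') along x = 0.5."""
    return np.where(x > 0.5, 1.0, 0.0) + 0.05 * np.sin(4 * x) * np.cos(4 * y)

def midpoint(p, q):
    return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

def subdivide(tri, tol, depth, samples):
    """Recursively refine a triangle (in the xy-plane) and collect samples."""
    a, b, c = tri
    centroid = ((a[0] + b[0] + c[0]) / 3.0, (a[1] + b[1] + c[1]) / 3.0)
    # Deviation proxy: difference between f at the centroid and the average of
    # f at the vertices (zero for a locally planar surface).
    planar = (f(*a) + f(*b) + f(*c)) / 3.0
    if depth == 0 or abs(f(*centroid) - planar) < tol:
        samples.append(centroid)          # accept one sample for this patch
        return
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    for child in [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]:
        subdivide(child, tol, depth - 1, samples)

samples = []
# Start from two triangles covering the unit square
subdivide(((0, 0), (1, 0), (1, 1)), tol=0.02, depth=6, samples=samples)
subdivide(((0, 0), (1, 1), (0, 1)), tol=0.02, depth=6, samples=samples)

pts = np.array(samples)
near_edge = np.abs(pts[:, 0] - 0.5) < 0.1
print(f"{len(pts)} samples, {near_edge.mean():.0%} within 0.1 of the step edge")
```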

  8. From Global Reconnaissance to Sample Return: A Proposal for a Post-2009 Strategy to Follow the Water on Mars

    NASA Technical Reports Server (NTRS)

    Clifford, S. M.; George, J. A.; Stoker, C. R.; Briggs, G.

    2003-01-01

    Since the mid-1990's, the stated strategy of the Mars Exploration Program has been to Follow the Water. Although this strategy has been widely publicized, its degree of influence, and the logic behind its current implementation (as reflected in mission planning, platform and instrument selection, and allocation of spacecraft resources), remain unclear. In response to this concern, we propose an integrated strategy for the post-2009 exploration of Mars that identifies the scientific objectives, rationale, sequence of missions, and specific investigations that, we believe, provide the maximum possible science return by pursuing the most direct, cost-effective, and technically capable approach to following the water. This strategy is based on the orbital identification, high-resolution surface investigation, and ultimate sampling of the highest priority targets: near-surface liquid water and massive ground ice (potentially associated with the discharge of the outflow channels or the relic of a former ocean). The analysis of such samples, in conjunction with the data acquired by the necessary precursor investigations (to identify the locations and characterize the environments of the optimum sampling sites), is expected to address a majority of the goals and high-priority science objectives identified by MEPAG.

  9. Development of a Strategy Based on the Surface Plasmon Resonance Technology for Platelet Compatibility Testing.

    PubMed

    Wu, Chang-Lin; He, Jian-An; Gu, Da-Yong; Shao, Chao-Peng; Zhu, Yi; Dang, Xin-Tang

    2018-01-01

    This study aimed to establish a novel strategy based on surface plasmon resonance (SPR) technology for platelet compatibility testing. A novel surface matrix was prepared based on poly(OEGMA-co-HEMA) via surface-initiated polymerization as a biosensor surface platform. Type O universal platelets and donor platelets were immobilized on these novel matrices via an amine-coupling reaction and served as capturing ligands for binding platelet antibodies. Antibodies binding to platelets were monitored in real time by injecting the samples into a microfluidic channel. Clinical serum samples (n = 186) from patients with multiple platelet transfusions were assayed for platelet antibodies using the SPR technology and the monoclonal antibody-immobilized platelet antigen (MAIPA) assay. The novel biosensor surface achieved a nonfouling background and high immobilization capacity and showed good repeatability and stability after regeneration. The limit of detection of the SPR biosensor for platelet antibody was estimated to be 50 ng/mL. The sensitivity and specificity were 92% and 98.7%, respectively. It could detect platelet antibodies directly in serum samples, and the results were similar to those of the MAIPA assay. A novel strategy to facilitate the sensitive and reliable detection of platelet compatibility with an SPR-based biosensor was established in this study. The SPR-based biosensor combined with the novel surface chemistry is a promising method for platelet compatibility testing.

  10. Structure-based feeding strategies: A key component of child nutrition.

    PubMed

    Taylor, Maija B; Emley, Elizabeth; Pratt, Mercedes; Musher-Eizenman, Dara R

    2017-07-01

    This study examined the relationship between structure, autonomy promotion, and control feeding strategies and parent-reported child diet. Participants (N = 497) were parents of children ages 2.5 to 7.5 recruited from Amazon Mechanical Turk. This was a predominantly Caucasian (79%), educated sample (61% college graduates), with most reports from mothers (76%). Data were collected through an online survey including measures of parent feeding strategies and child dietary intake. Use of structure-based feeding strategies explained 21% of the variance in child consumption of added sugar, 12% of the variance in child intake of added sugar from sugar-sweetened beverages, and 16% of the variance in child consumption of fruits and vegetables. Higher unhealthy food availability and permissive feeding uniquely predicted higher child added sugar intake and child consumption of added sugar from sugar-sweetened beverages. Greater healthy food availability uniquely predicted higher child fruit and vegetable intake. Conclusions and Future Directions: In Caucasian, educated families, structure-based feeding strategies appear to be a relatively stronger correlate of parent-reported child intake of added sugar and fruits and vegetables as compared to autonomy promotion and control feeding strategies. Longitudinal research may be needed in order to reveal the relationships between autonomy promotion and control feeding strategies and child diet. If future studies have similar findings to this study's results, researchers may want to focus more heavily on investigating the impact of teaching parents stimulus-control techniques and feeding-related assertiveness skills on child dietary intake. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. From large-eddy simulation to multi-UAVs sampling of shallow cumulus clouds

    NASA Astrophysics Data System (ADS)

    Lamraoui, Fayçal; Roberts, Greg; Burnet, Frédéric

    2016-04-01

    In-situ sampling of clouds that can provide simultaneous measurements at satisfying spatio-temporal resolutions to capture 3D small scale physical processes continues to present challenges. This project (SKYSCANNER) aims at bringing together cloud sampling strategies using a swarm of unmanned aerial vehicles (UAVs) based on Large-eddy simulation (LES). The multi-UAV-based field campaigns with a personalized sampling strategy for individual clouds and cloud fields will significantly improve the understanding of the unresolved cloud physical processes. An extensive set of LES experiments for case studies from ARM-SGP site have been performed using MesoNH model at high resolutions down to 10 m. The carried out simulations led to establishing a macroscopic model that quantifies the interrelationship between micro- and macrophysical properties of shallow convective clouds. Both the geometry and evolution of individual clouds are critical to multi-UAV cloud sampling and path planning. The preliminary findings of the current project reveal several linear relationships that associate many cloud geometric parameters to cloud related meteorological variables. In addition, the horizontal wind speed indicates a proportional impact on cloud number concentration as well as triggering and prolonging the occurrence of cumulus clouds. In the framework of the joint collaboration that involves a Multidisciplinary Team (including institutes specializing in aviation, robotics and atmospheric science), this model will be a reference point for multi-UAVs sampling strategies and path planning.

  12. Grade-related differences in strategy use in multidigit division in two instructional settings.

    PubMed

    Hickendorff, Marian; Torbeyns, Joke; Verschaffel, Lieven

    2017-11-23

    We aimed to investigate upper elementary children's strategy use in the domain of multidigit division in two instructional settings: the Netherlands and Flanders (Belgium). A cross-sectional sample of 119 Dutch and 122 Flemish fourth to sixth graders solved a varied set of multidigit division problems. With latent class analysis, three distinct strategy profiles were identified: children consistently using number-based strategies, children combining the use of column-based and number-based strategies, and children combining the use of digit-based and number-based strategies. The relation between children's strategy profiles and their instructional setting (country) and grade were generally in line with instructional differences, but large individual differences remained. Furthermore, Dutch children more frequently made adaptive strategy choices and realistic solutions than their Flemish peers. These results complement and refine previous findings on children's strategy use in relation to mathematics instruction. Statement of contribution What is already known? Mathematics education reform emphasizes variety, adaptivity, and insight in arithmetic strategies. Countries have different instructional trajectories for multidigit division. Mixed results on the impact of instruction on children's strategy use in multidigit division. What does this study add? Latent class analysis identified three meaningful strategy profiles in children from grades 4-6. These strategy profiles substantially differed between children. Dutch and Flemish children's strategy use is related to their instructional trajectory. © 2017 The Authors. British Journal of Developmental Psychology published by John Wiley & Sons Ltd on behalf of British Psychological Society.

  13. A Limited Repertoire of Emotion Regulation Strategies Is Associated with Internalizing Problems in Adolescence

    ERIC Educational Resources Information Center

    Lougheed, Jessica P.; Hollenstein, Tom

    2012-01-01

    The present study was designed to test whether the beneficial effects of emotion regulation (ER) have less to do with the use of singular, "adaptive" strategies and more to do with using a range of strategies. Using a community sample of adolescents (N = 177, M = 13.6 years), groups based on five measures of ER (reappraisal, suppression,…

  14. Direct and long-term detection of gene doping in conventional blood samples.

    PubMed

    Beiter, T; Zimmermann, M; Fragasso, A; Hudemann, J; Niess, A M; Bitzer, M; Lauer, U M; Simon, P

    2011-03-01

    The misuse of somatic gene therapy for the purpose of enhancing athletic performance is perceived as a coming threat to the world of sports and categorized as 'gene doping'. This article describes a direct detection approach for gene doping that gives a clear yes-or-no answer based on the presence or absence of transgenic DNA in peripheral blood samples. By exploiting a priming strategy to specifically amplify intronless DNA sequences, we developed PCR protocols allowing the detection of very small amounts of transgenic DNA in genomic DNA samples to screen for six prime candidate genes. Our detection strategy was verified in a mouse model, giving positive signals from minute amounts (20 μl) of blood samples for up to 56 days following intramuscular adeno-associated virus-mediated gene transfer, one of the most likely candidate vector systems to be misused for gene doping. To make our detection strategy amenable for routine testing, we implemented a robust sample preparation and processing protocol that allows cost-efficient analysis of small human blood volumes (200 μl) with high specificity and reproducibility. The practicability and reliability of our detection strategy was validated by a screening approach including 327 blood samples taken from professional and recreational athletes under field conditions.

  15. An efficient sampling strategy for selection of biobank samples using risk scores.

    PubMed

    Björk, Jonas; Malmqvist, Ebba; Rylander, Lars; Rignell-Hydbom, Anna

    2017-07-01

    The aim of this study was to suggest a new sample-selection strategy based on risk scores in case-control studies with biobank data. An ongoing Swedish case-control study on fetal exposure to endocrine disruptors and overweight in early childhood was used as the empirical example. Cases were defined as children with a body mass index (BMI) ⩾18 kg/m² (n = 545) at four years of age, and controls as children with a BMI of ⩽17 kg/m² (n = 4472 available). The risk of being overweight was modelled using logistic regression based on available covariates from the health examination and prior to selecting samples from the biobank. A risk score was estimated for each child and categorised as low (0-5%), medium (6-13%) or high (⩾14%) risk of being overweight. The final risk-score model, with smoking during pregnancy (p = 0.001), birth weight (p < 0.001), BMI of both parents (p < 0.001 for both), type of residence (p = 0.04) and economic situation (p = 0.12), yielded an area under the receiver operating characteristic curve of 67% (n = 3945 with complete data). The case group (n = 416) had the following risk-score profile: low (12%), medium (46%) and high risk (43%). Twice as many controls were selected from each risk group, with further matching on sex. Computer simulations showed that the proposed selection strategy with stratification on risk scores yielded consistent improvements in statistical precision. Using risk scores based on available survey or register data as a basis for sample selection may improve possibilities to study heterogeneity of exposure effects in biobank-based studies.
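
    A compact sketch of the selection logic is given below: a logistic model fitted on pre-biobank covariates produces a risk score, the score is binned into the low/medium/high strata quoted above, and twice as many controls as cases are drawn within each stratum. The data are simulated, the covariates are illustrative, and the additional matching on sex is omitted for brevity.

```python
# Sketch of risk-score-based control selection: fit a logistic model on
# pre-biobank covariates, bin the predicted risk into low/medium/high strata
# (cut-points as quoted above: <6%, 6-13%, >=14%), then draw twice as many
# controls as cases within each stratum. Data are simulated; sex matching is
# omitted for brevity.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "smoking": rng.integers(0, 2, n),
    "birth_weight": rng.normal(3.5, 0.5, n),
    "parent_bmi": rng.normal(25, 4, n),
})
# Simulated outcome (overweight at age four) with purely illustrative effects
logit = (-3 + 0.6 * df["smoking"] + 0.5 * (df["birth_weight"] - 3.5)
         + 0.12 * (df["parent_bmi"] - 25))
df["case"] = rng.random(n) < 1 / (1 + np.exp(-logit))

features = ["smoking", "birth_weight", "parent_bmi"]
model = LogisticRegression().fit(df[features], df["case"])
df["risk"] = model.predict_proba(df[features])[:, 1]
df["stratum"] = pd.cut(df["risk"], bins=[0, 0.06, 0.14, 1.0],
                       labels=["low", "medium", "high"], right=False)

selected = []
for stratum, grp in df.groupby("stratum", observed=True):
    n_cases = int(grp["case"].sum())
    controls = grp[~grp["case"]]
    take = min(2 * n_cases, len(controls))       # two controls per case
    selected.append(controls.sample(take, random_state=2))
controls = pd.concat(selected)
print(controls.groupby("stratum", observed=True).size())
```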

  16. A novel multi-scale adaptive sampling-based approach for energy saving in leak detection for WSN-based water pipelines

    NASA Astrophysics Data System (ADS)

    Saqib, Najam us; Faizan Mysorewala, Muhammad; Cheded, Lahouari

    2017-12-01

    In this paper, we propose a novel monitoring strategy for a wireless sensor networks (WSNs)-based water pipeline network. Our strategy uses a multi-pronged approach to reduce energy consumption based on the use of two types of vibration sensors and pressure sensors, all having different energy levels, and a hierarchical adaptive sampling mechanism to determine the sampling frequency. The sampling rate of the sensors is adjusted according to the bandwidth of the vibration signal being monitored by using a wavelet-based adaptive thresholding scheme that calculates the new sampling frequency for the following cycle. In this multimodal sensing scheme, the duty-cycling approach is used for all sensors to reduce the sampling instances, such that the high-energy, high-precision (HE-HP) vibration sensors have low duty cycles, and the low-energy, low-precision (LE-LP) vibration sensors have high duty cycles. The low duty-cycling (HE-HP) vibration sensor adjusts the sampling frequency of the high duty-cycling (LE-LP) vibration sensor. The simulated test bed considered here consists of a water pipeline network which uses pressure and vibration sensors, with the latter having different energy consumptions and precision levels, at various locations in the network. This is all the more useful for energy conservation for extended monitoring. It is shown that by using the novel features of our proposed scheme, a significant reduction in energy consumption is achieved and the leak is effectively detected by the sensor node that is closest to it. Finally, both the total energy consumed by monitoring as well as the time to detect the leak by a WSN node are computed, and show the superiority of our proposed hierarchical adaptive sampling algorithm over a non-adaptive sampling approach.
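
    The snippet below sketches one plausible reading of the wavelet-based rate adaptation: decompose the latest vibration window, locate the finest detail band carrying an appreciable share of the signal energy, and set the next cycle's sampling frequency from that band. The energy threshold, the band-to-rate mapping, and the use of PyWavelets are assumptions, not the authors' parameters.

```python
# Simplified sketch of a wavelet-based adaptive sampling-rate update. The
# energy threshold, the band-to-rate mapping and the safety margin are
# assumptions for illustration. Requires PyWavelets (pip install PyWavelets).
import numpy as np
import pywt

def next_sampling_rate(window, fs, wavelet="db4", max_level=5, energy_frac=0.05):
    """Suggest a sampling rate (Hz) for the next monitoring cycle."""
    coeffs = pywt.wavedec(window, wavelet, level=max_level)
    total = sum(float(np.sum(c ** 2)) for c in coeffs) or 1.0
    details = coeffs[1:]            # ordered cD_maxlevel, ..., cD1 (finest last)
    # Detail level j spans roughly (fs / 2**(j + 1), fs / 2**j]; scan from the
    # finest band to the coarsest and keep the first one holding a significant
    # share of the total signal energy.
    for j, d in enumerate(reversed(details), start=1):
        if float(np.sum(d ** 2)) / total >= energy_frac:
            return min(fs, 2.5 * fs / 2 ** j)   # band Nyquist plus a margin
    return fs / 2 ** max_level                  # little high-frequency content

# Example: a 100 Hz vibration keeps the suggested rate comparatively high,
# while a slow 5 Hz background lets the node back off and save energy.
fs = 1000.0
t = np.arange(0, 2, 1 / fs)
print(next_sampling_rate(np.sin(2 * np.pi * 100 * t), fs))
print(next_sampling_rate(np.sin(2 * np.pi * 5 * t), fs))
```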

  17. Strategy to increase Barangan Banana production in Kabupaten Deli Serdang

    NASA Astrophysics Data System (ADS)

    Adhany, I.; Chalil, D.; Ginting, R.

    2018-02-01

    This study was conducted to analyze the internal and external factors involved in increasing Barangan Banana production in Kabupaten Deli Serdang. Samples were determined by the snowball sampling technique and purposive sampling method. Using SWOT analysis, the study identified 6 internal strategic factors and 9 external strategic factors. Among these, support for production facilities emerged as the most important internal strategic factor, while demand for Barangan Banana was the most important external strategic factor. Based on the importance and existing condition of these factors, four sets of strategies are discussed and proposed to increase Barangan Banana productivity in Kabupaten Deli Serdang: strength-opportunity (SO) strategies that use support for production facilities and realize supporting facilities with farming experience; weakness-opportunity (WO) strategies that organize mentoring to meet the demand for Barangan Banana; strength-threat (ST) strategies that make use of funding support and subsidies to widen the land and to use tissue-culture seeds, facilities and infrastructure; and weakness-threat (WT) strategies that increase funding support to widen the land and to use tissue-culture seeds, facilities and infrastructure.

  18. Timing Recovery Strategies in Magnetic Recording Systems

    NASA Astrophysics Data System (ADS)

    Kovintavewat, Piya

    At some point in a digital communications receiver, the received analog signal must be sampled. Good performance requires that these samples be taken at the right times. The process of synchronizing the sampler with the received analog waveform is known as timing recovery. Conventional timing recovery techniques perform well only when operating at high signal-to-noise ratio (SNR). Nonetheless, iterative error-control codes allow reliable communication at very low SNR, where conventional techniques fail. This paper provides a detailed review on the timing recovery strategies based on per-survivor processing (PSP) that are capable of working at low SNR. We also investigate their performance in magnetic recording systems because magnetic recording is a primary method of storage for a variety of applications, including desktop, mobile, and server systems. Results indicate that the timing recovery strategies based on PSP perform better than the conventional ones and are thus worth being employed in magnetic recording systems.

  19. Effects of cooperative learning strategy on undergraduate kinesiology students' learning styles.

    PubMed

    Meeuwsen, Harry J; King, George A; Pederson, Rockie

    2005-10-01

    A growing body of research supports cooperative learning as an effective teaching strategy. A specific cooperative learning strategy, Team-based Learning, was applied to a convenience sample of four undergraduate sophomore-level motor behavior courses over four semesters from Fall 2002 to Spring 2004 to examine whether this strategy would affect students' learning styles. The data from the Grasha-Reichmann Student Learning Style Scales indicated that this teaching strategy was associated with a significant decrease in the negative Avoidant and Dependent learning styles and an improvement in the positive Participant learning style.

  20. Tandem Extraction/Liquid Chromatography-Mass Spectrometry Protocol for the Analysis of Acrylamide and Surfactant-related Compounds in Complex Aqueous Environmental Samples

    EPA Science Inventory

    Development of a liquid chromatography-mass spectrometry (LC-MS)-based strategy for the detection and quantitation of acrylamide and surfactant-related compounds in complex aqueous environmental samples is described.

  1. Multidimensional electrostatic repulsion-hydrophilic interaction chromatography (ERLIC) for quantitative analysis of the proteome and phosphoproteome in clinical and biomedical research.

    PubMed

    Loroch, Stefan; Schommartz, Tim; Brune, Wolfram; Zahedi, René Peiman; Sickmann, Albert

    2015-05-01

    Quantitative proteomics and phosphoproteomics have become key disciplines in understanding cellular processes. Fundamental research can be done using cell culture, providing researchers with virtually infinite sample amounts. In contrast, clinical, pre-clinical and biomedical research is often restricted to minute sample amounts and requires an efficient analysis with only micrograms of protein. To address this issue, we generated a highly sensitive workflow for combined LC-MS-based quantitative proteomics and phosphoproteomics by refining an ERLIC-based 2D phosphoproteomics workflow into an ERLIC-based 3D workflow covering the global proteome as well. The resulting 3D strategy was successfully used for an in-depth quantitative analysis of both the proteome and the phosphoproteome of murine cytomegalovirus-infected mouse fibroblasts, a model system for host cell manipulation by a virus. In a 2-plex SILAC experiment with 150 μg of a tryptic digest per condition, the 3D strategy enabled the quantification of ~75% more proteins and even ~134% more peptides compared to the 2D strategy. Additionally, we could quantify ~50% more phosphoproteins by non-phosphorylated peptides, concurrently yielding insights into changes at the levels of protein expression and phosphorylation. Besides its sensitivity, our novel three-dimensional ERLIC strategy has the potential for semi-automated sample processing, rendering it a suitable approach for future clinical, pre-clinical and biomedical research. Copyright © 2015. Published by Elsevier B.V.

  2. The Diagnosis of Urinary Tract Infection in Young Children (DUTY) Study Clinical Rule: Economic Evaluation.

    PubMed

    Hollingworth, William; Busby, John; Butler, Christopher C; O'Brien, Kathryn; Sterne, Jonathan A C; Hood, Kerenza; Little, Paul; Lawton, Michael; Birnie, Kate; Thomas-Jones, Emma; Harman, Kim; Hay, Alastair D

    2017-04-01

    To estimate the cost-effectiveness of a two-step clinical rule using symptoms, signs and dipstick testing to guide the diagnosis and antibiotic treatment of urinary tract infection (UTI) in acutely unwell young children presenting to primary care. Decision analytic model synthesising data from a multicentre, prospective cohort study (DUTY) and the wider literature to estimate the short-term and lifetime costs and healthcare outcomes (symptomatic days, recurrent UTI, quality adjusted life years) of eight diagnostic strategies. We compared GP clinical judgement with three strategies based on a 'coefficient score' combining seven symptoms and signs independently associated with UTI and four strategies based on weighted scores according to the presence/absence of five symptoms and signs. We compared dipstick testing versus laboratory culture in children at intermediate risk of UTI. Sampling, culture and antibiotic costs were lowest in high-specificity DUTY strategies (£1.22 and £1.08) compared to clinical judgement (£1.99). These strategies also approximately halved urine sampling (4.8% versus 9.1% in clinical judgement) without reducing sensitivity (58.2% versus 56.4%). Outcomes were very similar across all diagnostic strategies. High-specificity DUTY strategies were more cost-effective than clinical judgement in the short- (iNMB = £0.78 and £0.84) and long-term (iNMB =£2.31 and £2.50). Dipstick tests had poorer cost-effectiveness than laboratory culture in children at intermediate risk of UTI (iNMB = £-1.41). Compared to GPs' clinical judgement, high specificity clinical rules from the DUTY study could substantially reduce urine sampling, achieving lower costs and equivalent patient outcomes. Dipstick testing children for UTI is not cost-effective. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
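
    The ranking quoted above rests on the incremental net monetary benefit, iNMB = λ × ΔQALYs − ΔCosts, evaluated relative to GP clinical judgement at a willingness-to-pay threshold λ. The sketch below shows the arithmetic with placeholder numbers and an assumed threshold, not the DUTY study estimates.

```python
# Sketch of the incremental net monetary benefit (iNMB) calculation used to
# rank diagnostic strategies: iNMB = wtp * delta_QALY - delta_cost, relative
# to GP clinical judgement. The figures below are placeholders, not the DUTY
# study estimates; the willingness-to-pay threshold is an assumption.
WTP = 20000.0   # pounds per QALY (assumed threshold)

strategies = {
    # name: (incremental cost in pounds, incremental QALYs) vs clinical judgement
    "high-specificity rule 1": (-0.70, 1e-6),
    "high-specificity rule 2": (-0.80, 1e-6),
    "dipstick at intermediate risk": (1.20, -1e-6),
}

for name, (d_cost, d_qaly) in strategies.items():
    inmb = WTP * d_qaly - d_cost
    verdict = "cost-effective" if inmb > 0 else "not cost-effective"
    print(f"{name}: iNMB = {inmb:+.2f} pounds ({verdict})")
```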

  3. Research on the influence of institutional pressures on green innovation strategy

    NASA Astrophysics Data System (ADS)

    Zewen, Chen; xin, Li; Hongjun, Cao

    2017-11-01

    Based on new institutional theory and a sample of 116 enterprises, this paper explores the factors influencing green innovation strategy from the perspectives of forced pressure, normative pressure and imitation pressure. The results show that mandatory regulation, incentive regulation, supply chain pressure, and competitive pressure all have a significant and positive impact on green innovation strategy. Therefore, the government should take steps to stimulate enterprises to choose a green innovation strategy.

  4. Application of the laser induced deflection (LID) technique for low absorption measurements in bulk materials and coatings

    NASA Astrophysics Data System (ADS)

    Triebel, W.; Mühlig, C.; Kufert, S.

    2005-10-01

    Precise absorption measurements of bulk materials and coatings upon pulsed ArF laser irradiation are presented using a compact experimental setup based on the laser induced deflection technique (LID). For absorption measurements of bulk materials, the influence of pure bulk and pure surface absorption on the temperature and refractive index profile, and thus on the probe beam deflection, is analyzed in detail. The separation of bulk and surface absorption via the commonly used variation of the sample thickness is carried out for fused silica and calcium fluoride. The experimental results show that, for the given surface polishing quality, the bulk absorption coefficient of fused silica can be obtained by investigating only one sample. To avoid the drawback of different bulk and surface properties amongst a thickness series, we propose a strategy based on the LID technique to obtain surface and bulk absorption separately by investigating only one sample. Apart from measuring bulk absorption coefficients, the LID technique is applied to determine the absorption of highly reflecting (HR) coatings on CaF2 substrates. Besides the measuring strategy, the experimental results for an AlF3/LaF3-based HR coating are presented. In order to investigate a larger variety of coatings, including highly transmitting coatings, a general measuring strategy based on the LID technique is proposed.

  5. Adding-point strategy for reduced-order hypersonic aerothermodynamics modeling based on fuzzy clustering

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Liu, Li; Zhou, Sida; Yue, Zhenjiang

    2016-09-01

    Reduced order models (ROMs) based on snapshots from high-fidelity CFD simulations have received great attention recently due to their capability of capturing the features of complex geometries and flow configurations. To improve the efficiency and precision of ROMs, it is indispensable to add extra sampling points to the initial snapshots, since the number of sampling points needed to achieve an adequately accurate ROM is generally unknown a priori, while a large number of initial sampling points reduces the parsimony of the ROMs. A fuzzy-clustering-based adding-point strategy is proposed, in which the fuzzy clustering acts as an indicator of the regions where the precision of the ROM is relatively low. The proposed method is applied to construct ROMs for benchmark mathematical examples and a numerical example of hypersonic aerothermodynamics prediction for a typical control surface. The proposed method achieves a 34.5% improvement in efficiency over the estimated mean squared error prediction algorithm while showing the same level of prediction accuracy.
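
    A minimal numpy rendering of the adding-point idea is sketched below: fuzzy c-means is run over the current sampling points in parameter space, each cluster is scored with an error indicator for the ROM, and the next snapshot is placed at the centre of the worst-scoring cluster. The two-parameter space and the error indicator are synthetic stand-ins, not the paper's test cases.

```python
# Minimal numpy sketch of a fuzzy-clustering-based adding-point rule. The 2-D
# parameter space and the ROM error indicator are synthetic stand-ins.
import numpy as np

def fuzzy_cmeans(X, n_clusters, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """Plain fuzzy c-means; returns cluster centres and the membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), n_clusters))
    U /= U.sum(axis=1, keepdims=True)                  # initial fuzzy memberships
    for _ in range(n_iter):
        W = U ** m
        centres = (W.T @ X) / W.sum(axis=0)[:, None]   # weighted cluster centres
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        ratio = d[:, :, None] / d[:, None, :]          # d_ik / d_ij
        U_new = 1.0 / (ratio ** (2.0 / (m - 1.0))).sum(axis=2)
        if np.max(np.abs(U_new - U)) < tol:
            return centres, U_new
        U = U_new
    return centres, U

rng = np.random.default_rng(3)
X = rng.random((40, 2))            # current sampling plan in a normalised space

# Surrogate ROM error indicator per existing sampling point (synthetic: assume
# the ROM is least accurate towards the upper-right corner of the space).
err = np.exp(-np.linalg.norm(X - np.array([0.9, 0.9]), axis=1))

centres, U = fuzzy_cmeans(X, n_clusters=4)
cluster_error = (U * err[:, None]).sum(axis=0) / U.sum(axis=0)
new_point = centres[np.argmax(cluster_error)]
print("add the next CFD snapshot near", np.round(new_point, 3))
```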

  6. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform

    NASA Astrophysics Data System (ADS)

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-11-01

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.
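
    Quantitative readout from such a dose-dependent sensor is usually obtained by calibrating a sigmoidal dose-response curve and inverting it for unknown samples; the sketch below does this with a four-parameter logistic fit in SciPy. The calibration points are synthetic and the curve form is a generic assumption, not the authors' processing pipeline.

```python
# Sketch of a quantitation step: fit a four-parameter logistic (Hill-type)
# dose-response curve to fluorescence readouts and invert it to estimate ATP
# concentration in an unknown sample. All readings below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(c, bottom, top, ec50, hill):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / c) ** hill)

# Synthetic calibration points: ATP concentration (uM) vs normalised signal
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
signal = (four_pl(conc, 0.05, 1.0, 5.0, 1.2)
          + np.random.default_rng(0).normal(0, 0.02, conc.size))

params, _ = curve_fit(four_pl, conc, signal, p0=[0.0, 1.0, 5.0, 1.0])
bottom, top, ec50, hill = params

def invert(sig):
    """Map a measured signal back to an ATP concentration (uM)."""
    return ec50 * ((top - bottom) / (sig - bottom) - 1.0) ** (-1.0 / hill)

print(f"fitted EC50 = {ec50:.2f} uM; "
      f"unknown sample at signal 0.6 -> {invert(0.6):.2f} uM")
```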

  7. Improving the understanding of schistosomiasis among adolescents in endemic areas in Brazil: A comparison of educational methods.

    PubMed

    Gazzinelli, Maria Flávia; Lobato, Lucas; Andrade, Gisele; Matoso, Leonardo Ferreira; Diemert, David J; Gazzinelli, Andréa

    2016-10-01

    To evaluate the effectiveness of two teaching strategies, both guided by the concept of dialogicity, on adolescents' knowledge about schistosomiasis and adherence to diagnostic fecal testing. Two teaching strategies related to schistosomiasis were developed, an educational video and group conversation, which were tested in two groups of students aged 10-15 years old. Before and after the intervention, a questionnaire was applied to assess participants' knowledge about schistosomiasis and, after the intervention, two fecal samples were requested from each participant. Comparisons were performed by paired t- and McNemar tests. Both strategies resulted in statistically significant improvements in knowledge between the pre- and post-tests. Students who watched the video had a higher return rate of fecal samples and percentage of correct questionnaire answers, mainly on questions about schistosomiasis infection. Teaching strategies based on dialogue favored the construction of concepts about schistosomiasis that can influence the adoption of positives attitudes related to health. Using teaching strategies based on the concept of dialogicity can favor the increase of knowledge of school age children about schistosomiasis and can influence behavioral change related to health. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. A simple multi-scale Gaussian smoothing-based strategy for automatic chromatographic peak extraction.

    PubMed

    Fu, Hai-Yan; Guo, Jun-Wei; Yu, Yong-Jie; Li, He-Dong; Cui, Hua-Peng; Liu, Ping-Ping; Wang, Bing; Wang, Sheng; Lu, Peng

    2016-06-24

    Peak detection is a critical step in chromatographic data analysis. In the present work, we developed a multi-scale Gaussian smoothing-based strategy for accurate peak extraction. The strategy consisted of three stages: background drift correction, peak detection, and peak filtration. Background drift correction was implemented using a moving window strategy. The new peak detection method is a variant of the system used by the well-known MassSpecWavelet, i.e., chromatographic peaks are found at local maximum values under various smoothing window scales. Therefore, peaks can be detected through the ridge lines of maximum values under these window scales, and signals that are monotonously increased/decreased around the peak position could be treated as part of the peak. Instrumental noise was estimated after peak elimination, and a peak filtration strategy was performed to remove peaks with signal-to-noise ratios smaller than 3. The performance of our method was evaluated using two complex datasets. These datasets include essential oil samples for quality control obtained from gas chromatography and tobacco plant samples for metabolic profiling analysis obtained from gas chromatography coupled with mass spectrometry. Results confirmed the reasonability of the developed method. Copyright © 2016 Elsevier B.V. All rights reserved.
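
    The following is a simplified sketch of the three stages described above: a moving-window baseline correction, peak candidates taken as local maxima that persist across several Gaussian smoothing scales (a cheap stand-in for ridge-line tracking), and removal of candidates with a signal-to-noise ratio below 3. The test chromatogram and all parameter values are illustrative assumptions, not the published settings.

```python
# Simplified three-stage sketch: (1) moving-window baseline correction,
# (2) peaks as local maxima persisting across several Gaussian smoothing
# scales, (3) removal of peaks with S/N < 3. Parameters and the test
# chromatogram are illustrative only.
import numpy as np
from scipy.ndimage import gaussian_filter1d, median_filter
from scipy.signal import argrelmax

def detect_peaks(y, scales=(2, 4, 8), baseline_win=201, match_tol=5, min_snr=3.0):
    # 1) background drift correction with a moving-window (median) estimate
    signal = y - median_filter(y, size=baseline_win)
    # 2) local maxima at each Gaussian smoothing scale; keep maxima from the
    #    finest scale that persist (within +/- match_tol points) at all
    #    coarser scales
    maxima = [argrelmax(gaussian_filter1d(signal, s))[0] for s in scales]
    persistent = [i for i in maxima[0]
                  if all(np.min(np.abs(m - i)) <= match_tol for m in maxima[1:])]
    # 3) noise estimated from the residual after heavy smoothing, then an
    #    SNR >= 3 filter
    noise = np.std(signal - gaussian_filter1d(signal, max(scales)))
    return [int(i) for i in persistent if signal[i] / noise >= min_snr]

# Tiny synthetic chromatogram: drifting baseline + two peaks + noise
rng = np.random.default_rng(7)
x = np.arange(2000)
y = (0.002 * x
     + np.exp(-((x - 600) / 15.0) ** 2)
     + 0.5 * np.exp(-((x - 1400) / 25.0) ** 2)
     + rng.normal(0, 0.02, x.size))

print("detected peak positions:", detect_peaks(y))
```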

  9. Improving the Understanding of Schistosomiasis among Adolescents in Endemic Areas in Brazil: a Comparison of Educational Methods

    PubMed Central

    Gazzinelli, Maria Flávia; Lobato, Lucas; Andrade, Gisele; Matoso, Leonardo Ferreira; Diemert, David J.; Gazzinelli, Andréa

    2016-01-01

    Objective To evaluate the effectiveness of two teaching strategies, both guided by the concept of dialogicity, on adolescents’ knowledge about schistosomiasis and adherence to diagnostic fecal testing. Methods Two teaching strategies related to schistosomiasis were developed, an educational video and group conversation, which were tested in two groups of students aged 10–15 years old. Before and after the intervention, a questionnaire was applied to assess participants' knowledge about schistosomiasis and, after the intervention, two fecal samples were requested from each participant. Comparisons were performed by paired t- and McNemar tests. Results Both strategies resulted in statistically significant improvements in knowledge between the pre- and post-tests. Students who watched the video had a higher return rate of fecal samples and percentage of correct questionnaire answers, mainly on questions about schistosomiasis infection. Conclusion teaching strategies based on dialogue favored the construction of concepts about schistosomiasis that can influence the adoption of positives attitudes related to health. Practical Implications Using teaching strategies based on the concept of dialogicity can favor the increase of knowledge of school age children about schistosomiasis and can influence behavioral change related to health. PMID:27180618

  10. Advances in stable isotope assisted labeling strategies with information science.

    PubMed

    Kigawa, Takanori

    2017-08-15

    Stable-isotope (SI) labeling of proteins is an essential technique to investigate their structures, interactions or dynamics by nuclear magnetic resonance (NMR) spectroscopy. The assignment of the main-chain signals, which is the fundamental first step in these analyses, is usually achieved by a sequential assignment method based on triple resonance experiments. Independently of the triple resonance experiment-based sequential assignment, amino acid-selective SI labeling is beneficial for discriminating the amino acid type of each signal; therefore, it is especially useful for the signal assignment of difficult targets. Various combinatorial selective labeling schemes have been developed as more sophisticated labeling strategies. In these strategies, amino acids are represented by combinations of SI labeled samples, rather than simply assigning one amino acid to one SI labeled sample as in the case of conventional amino acid-selective labeling. These strategies have proven to be useful for NMR analyses of difficult proteins, such as those in large complex systems, in living cells, attached or integrated into membranes, or with poor solubility. In this review, recent advances in stable isotope assisted labeling strategies will be discussed. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. A Comparison of Two Sampling Strategies to Assess Discomycete Diversity in Wet Tropical Forests

    Treesearch

    SHARON A. CANTRELL

    2004-01-01

    Most of the fungal diversity studies that have used a systematic collecting scheme have not included the discomycetes, so optimal sampling methods are not available for this group. In this study, I tested two sampling methods at each of two sites, the Caribbean National Forest, Puerto Rico, and the Ebano Verde Reserve, Dominican Republic. For a plot-based sampling method, 10 ×...

  12. Adaptive sampling strategies with high-throughput molecular dynamics

    NASA Astrophysics Data System (ADS)

    Clementi, Cecilia

    Despite recent significant hardware and software developments, the complete thermodynamic and kinetic characterization of large macromolecular complexes by molecular simulations still presents significant challenges. The high dimensionality of these systems and the complexity of the associated potential energy surfaces (creating multiple metastable regions connected by high free energy barriers) do not usually allow the relevant regions of their configurational space to be adequately sampled by means of a single, long Molecular Dynamics (MD) trajectory. Several different approaches have been proposed to tackle this sampling problem. We focus on the development of ensemble simulation strategies, where data from a large number of weakly coupled simulations are integrated to explore the configurational landscape of a complex system more efficiently. Ensemble methods are of increasing interest as the hardware roadmap is now mostly based on increasing core counts rather than clock speeds. The main challenge in the development of an ensemble approach for efficient sampling is in the design of strategies to adaptively distribute the trajectories over the relevant regions of the system's configurational space, without using any a priori information on its global properties. We will discuss the definition of smart adaptive sampling approaches that can redirect computational resources towards unexplored yet relevant regions. Our approaches are based on new developments in dimensionality reduction for high-dimensional dynamical systems and optimal redistribution of resources. NSF CHE-1152344, NSF CHE-1265929, Welch Foundation C-1570.
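
    The adaptive redistribution described above can be illustrated with a least-counts reseeding loop: run a batch of short trajectories, tally visits to discretized states, and seed the next batch from the least-visited states. The sketch below is a toy illustration of that generic idea only; the random-walk "trajectory", state discretization, and batch sizes are placeholders, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states = 50                      # toy discretized configurational space
counts = np.zeros(n_states, int)   # visit counts accumulated over all trajectories

def short_trajectory(start, n_steps=200):
    """Toy stand-in for a short MD run: a random walk over state indices."""
    x, visited = start, [start]
    for _ in range(n_steps):
        x = int(np.clip(x + rng.choice([-1, 0, 1]), 0, n_states - 1))
        visited.append(x)
    return visited

# Adaptive rounds: launch a small ensemble, then reseed from the least-visited states
seeds = list(rng.integers(0, n_states, size=5))
for rnd in range(10):
    for s in seeds:
        for x in short_trajectory(s):
            counts[x] += 1
    # least-counts criterion: redirect resources to poorly sampled regions
    seeds = list(np.argsort(counts)[:5])

print("fraction of states visited:", np.mean(counts > 0))
```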

  13. A comparison of sample preparation strategies for biological tissues and subsequent trace element analysis using LA-ICP-MS.

    PubMed

    Bonta, Maximilian; Török, Szilvia; Hegedus, Balazs; Döme, Balazs; Limbeck, Andreas

    2017-03-01

    Laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) is one of the most commonly applied methods for lateral trace element distribution analysis in medical studies. Many improvements of the technique regarding quantification and achievable lateral resolution have been made in recent years. Nevertheless, sample preparation is also of major importance, and the optimal sample preparation strategy still has not been defined. While conventional histology uses a number of sample pre-treatment strategies, little is known about the effect of these approaches on the lateral distributions of elements and/or their quantities in tissues. The technique of formalin fixation and paraffin embedding (FFPE) has emerged as the gold standard in tissue preparation. However, its potential use for elemental distribution studies is questionable due to the large number of sample preparation steps. In this work, LA-ICP-MS was used to examine the applicability of the FFPE sample preparation approach for elemental distribution studies. Qualitative elemental distributions as well as quantitative concentrations in cryo-cut tissues and FFPE samples were compared. Results showed that some metals (especially Na and K) are severely affected by the FFPE process, whereas others (e.g., Mn, Ni) are less influenced. Based on these results, a general recommendation can be given: FFPE samples are completely unsuitable for the analysis of alkali metals. When analyzing transition metals, FFPE samples can give comparable results to snap-frozen tissues. Graphical abstract Sample preparation strategies for biological tissues are compared with regard to the elemental distributions and average trace element concentrations.

  14. Towards a sampling strategy for the assessment of forest condition at European level: combining country estimates.

    PubMed

    Travaglini, Davide; Fattorini, Lorenzo; Barbati, Anna; Bottalico, Francesca; Corona, Piermaria; Ferretti, Marco; Chirici, Gherardo

    2013-04-01

    A correct characterization of the status and trend of forest condition is essential to support reporting processes at national and international level. An international forest condition monitoring has been implemented in Europe since 1987 under the auspices of the International Co-operative Programme on Assessment and Monitoring of Air Pollution Effects on Forests (ICP Forests). The monitoring is based on harmonized methodologies, with individual countries being responsible for its implementation. Due to inconsistencies and problems in sampling design, however, the ICP Forests network is not able to produce reliable quantitative estimates of forest condition at European and sometimes at country level. This paper proposes (1) a set of requirements for status and change assessment and (2) a harmonized sampling strategy able to provide unbiased and consistent estimators of forest condition parameters and of their changes at both country and European level. Under the assumption that a common definition of forest holds among European countries, monitoring objectives, parameters of concern and accuracy indexes are stated. On the basis of fixed-area plot sampling performed independently in each country, an unbiased and consistent estimator of forest defoliation indexes is obtained at both country and European level, together with conservative estimators of their sampling variance and power in the detection of changes. The strategy adopts a probabilistic sampling scheme based on fixed-area plots selected by means of systematic or stratified schemes. Operative guidelines for its application are provided.

  15. Effectiveness of a School-Based Yoga Program on Adolescent Mental Health, Stress Coping Strategies, and Attitudes toward Violence: Findings from a High-Risk Sample

    ERIC Educational Resources Information Center

    Frank, Jennifer L.; Bose, Bidyut; Schrobenhauser-Clonan, Alex

    2014-01-01

    This study aimed to assess the effectiveness of a universal yoga-based social-emotional wellness promotion program, Transformative Life Skills, on indicators of adolescent emotional distress, prosocial behavior, and attitudes toward violence in a high-risk sample. Participants included 49 students attending an alternative education school in an…

  16. Sampling of temporal networks: Methods and biases

    NASA Astrophysics Data System (ADS)

    Rocha, Luis E. C.; Masuda, Naoki; Holme, Petter

    2017-11-01

    Temporal networks have been increasingly used to model a diversity of systems that evolve in time; for example, human contact structures over which dynamic processes such as epidemics take place. A fundamental aspect of real-life networks is that they are sampled within temporal and spatial frames. Furthermore, one might wish to subsample networks to reduce their size for better visualization or to perform computationally intensive simulations. The sampling method may affect the network structure and thus caution is necessary to generalize results based on samples. In this paper, we study four sampling strategies applied to a variety of real-life temporal networks. We quantify the biases generated by each sampling strategy on a number of relevant statistics such as link activity, temporal paths and epidemic spread. We find that some biases are common in a variety of networks and statistics, but one strategy, uniform sampling of nodes, shows improved performance in most scenarios. Given the particularities of temporal network data and the variety of network structures, we recommend that the choice of sampling methods be problem oriented to minimize the potential biases for the specific research questions on hand. Our results help researchers to better design network data collection protocols and to understand the limitations of sampled temporal network data.

  17. Nontargeted Screening Method for Illegal Additives Based on Ultrahigh-Performance Liquid Chromatography-High-Resolution Mass Spectrometry.

    PubMed

    Fu, Yanqing; Zhou, Zhihui; Kong, Hongwei; Lu, Xin; Zhao, Xinjie; Chen, Yihui; Chen, Jia; Wu, Zeming; Xu, Zhiliang; Zhao, Chunxia; Xu, Guowang

    2016-09-06

    Identification of illegal additives in complex matrixes is important in the food safety field. In this study a nontargeted screening strategy was developed to find illegal additives based on ultrahigh-performance liquid chromatography-high-resolution mass spectrometry (UHPLC-HRMS). First, an analytical method for possible illegal additives in complex matrixes was established including fast sample pretreatment, accurate UHPLC separation, and HRMS detection. Second, efficient data processing and differential analysis workflow were suggested and applied to find potential risk compounds. Third, structure elucidation of risk compounds was performed by (1) searching online databases [Metlin and the Human Metabolome Database (HMDB)] and an in-house database which was established at the above-defined conditions of UHPLC-HRMS analysis and contains information on retention time, mass spectra (MS), and tandem mass spectra (MS/MS) of 475 illegal additives, (2) analyzing fragment ions, and (3) referring to fragmentation rules. Fish was taken as an example to show the usefulness of the nontargeted screening strategy, and six additives were found in suspected fish samples. Quantitative analysis was further carried out to determine the contents of these compounds. The satisfactory application of this strategy in fish samples means that it can also be used in the screening of illegal additives in other kinds of food samples.
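
    The core of the structure-elucidation step — matching risk compounds against an in-house database on retention time and accurate mass — can be sketched as a simple tolerance search. The compound names, retention times, masses, and tolerances below are hypothetical placeholders, not entries from the 475-additive database used in the study.

```python
import numpy as np

# Hypothetical in-house database rows: (name, retention time in min, exact neutral mass in Da)
database = [("malachite green", 7.82, 329.2018),
            ("chloramphenicol", 5.10, 322.0123)]

# Hypothetical detected risk features from UHPLC-HRMS: (retention time, measured neutral mass)
features = [(7.80, 329.2024), (3.45, 180.0634)]

def match(features, database, rt_tol=0.2, ppm_tol=5.0):
    """Annotate features whose retention time and accurate mass fall within the tolerances."""
    hits = []
    for rt, mass in features:
        for name, db_rt, db_mass in database:
            ppm = abs(mass - db_mass) / db_mass * 1e6
            if abs(rt - db_rt) <= rt_tol and ppm <= ppm_tol:
                hits.append((rt, mass, name, round(ppm, 2)))
    return hits

print(match(features, database))
```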

  18. Generative Learning Strategy Use and Self-Regulatory Prompting in Digital Text

    ERIC Educational Resources Information Center

    Reid, Alan J.; Morrison, Gary M.

    2014-01-01

    The digital revolution is shifting print-based textbooks to digital text, and it has afforded the opportunity to incorporate meaningful learning strategies and otherwise separate metacognitive activities directly into these texts as embedded support. A sample of 89 undergraduates read a digital, expository text on the basics of photography. The…

  19. Coping Strategies for Managing Acculturative Stress among Asian International Students

    ERIC Educational Resources Information Center

    Ra, Young-An; Trusty, Jerry

    2015-01-01

    This article examines the effects of specific coping strategies on managing acculturative stress and acculturation of Asian international students, based on a sample of 220 Asian international students in the U.S. The data were analyzed with hierarchical multiple regression using Baron and Kenny's (1986) mediation procedure. The results supported…

  20. "FearNot!": A Computer-Based Anti-Bullying-Programme Designed to Foster Peer Intervention

    ERIC Educational Resources Information Center

    Vannini, Natalie; Enz, Sibylle; Sapouna, Maria; Wolke, Dieter; Watson, Scott; Woods, Sarah; Dautenhahn, Kerstin; Hall, Lynne; Paiva, Ana; Andre, Elizabeth; Aylett, Ruth; Schneider, Wolfgang

    2011-01-01

    Bullying is widespread in European schools, despite multiple intervention strategies having been proposed over the years. The present study investigates the effects of a novel virtual learning strategy ("FearNot!") to tackle bullying in both UK and German samples. The approach is intended primarily for victims to increase their coping…

  1. Pre-Service and In-Service Teachers' Metacognitive Knowledge about Problem-Solving Strategies

    ERIC Educational Resources Information Center

    Metallidou, Panayiota

    2009-01-01

    The present study was based on the methodology of Antonietti, Ignazi, and Perego (2000, "Metacognitive knowledge about problem-solving methods," British Journal of Educational Psychology, 70, 1-16), with the aim of examining primary school teachers' metacognitive knowledge about problem-solving strategies. A sample of 338 in-service (172) and…

  2. An Internal Standard for Assessing Phosphopeptide Recovery from Metal Ion/Oxide Enrichment Strategies

    NASA Astrophysics Data System (ADS)

    Paulo, Joao A.; Navarrete-Perea, Jose; Erickson, Alison R.; Knott, Jeffrey; Gygi, Steven P.

    2018-04-01

    Phosphorylation-mediated signaling pathways have major implications in cellular regulation and disease. However, proteins with roles in these pathways are frequently less abundant and phosphorylation is often sub-stoichiometric. As such, the efficient enrichment, and subsequent recovery, of phosphorylated peptides is vital. Mass spectrometry-based proteomics is a well-established approach for quantifying thousands of phosphorylation events in a single experiment. We designed a peptide internal standard-based assay directed toward sample preparation strategies for mass spectrometry analysis to better understand phosphopeptide recovery from enrichment strategies. We coupled mass-differential tandem mass tag (mTMT) reagents (specifically, TMTzero and TMTsuper-heavy), nine mass spectrometry-amenable phosphopeptides (phos9), and peak area measurements from extracted ion chromatograms to determine phosphopeptide recovery. We showcase this mTMT/phos9 recovery assay by evaluating three phosphopeptide enrichment workflows. Our assay provides data on the recovery of phosphopeptides, which complement other metrics, namely the number of identified phosphopeptides and enrichment specificity. Our mTMT/phos9 assay is applicable to any enrichment protocol in a typical experimental workflow, irrespective of sample origin or labeling strategy.

  3. A fosmid cloning strategy for detecting the widest possible spectrum of microbes from the international space station drinking water system.

    PubMed

    Choi, Sangdun; Chang, Mi Sook; Stuecker, Tara; Chung, Christine; Newcombe, David A; Venkateswaran, Kasthuri

    2012-12-01

    In this study, fosmid cloning strategies were used to assess the microbial populations in water from the International Space Station (ISS) drinking water system (henceforth referred to as Prebiocide and Tank A water samples). The goals of this study were: to compare the sensitivity of the fosmid cloning strategy with that of traditional culture-based and 16S rRNA-based approaches and to detect the widest possible spectrum of microbial populations during the water purification process. Initially, microbes could not be cultivated, and conventional PCR failed to amplify 16S rDNA fragments from these low biomass samples. Therefore, randomly primed rolling-circle amplification was used to amplify any DNA that might be present in the samples, followed by size selection by using pulsed-field gel electrophoresis. The amplified high-molecular-weight DNA from both samples was cloned into fosmid vectors. Several hundred clones were randomly selected for sequencing, followed by Blastn/Blastx searches. Sequences encoding specific genes from Burkholderia, a species abundant in the soil and groundwater, were found in both samples. Bradyrhizobium and Mesorhizobium, which belong to rhizobia, a large community of nitrogen fixers often found in association with plant roots, were present in the Prebiocide samples. Ralstonia, which is prevalent in soils with a high heavy metal content, was detected in the Tank A samples. The detection of many unidentified sequences suggests the presence of potentially novel microbial fingerprints. The bacterial diversity detected in this pilot study using a fosmid vector approach was higher than that detected by conventional 16S rRNA gene sequencing.

  4. Efficiency enhancement of optimized Latin hypercube sampling strategies: Application to Monte Carlo uncertainty analysis and meta-modeling

    NASA Astrophysics Data System (ADS)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad; Janssen, Hans

    2015-02-01

    The majority of literature regarding optimized Latin hypercube sampling (OLHS) is devoted to increasing the efficiency of these sampling strategies through the development of new algorithms based on the combination of innovative space-filling criteria and specialized optimization schemes. However, little attention has been given to the impact of the initial design that is fed into the optimization algorithm on the efficiency of OLHS strategies. Previous studies, as well as codes developed for OLHS, have relied on one of the following two approaches for the selection of the initial design in OLHS: (1) the use of random points in the hypercube intervals (random LHS), and (2) the use of midpoints in the hypercube intervals (midpoint LHS). Both approaches have been extensively used, but no attempt has been previously made to compare the efficiency and robustness of their resulting sample designs. In this study we compare the two approaches and show that the space-filling characteristics of OLHS designs are sensitive to the initial design that is fed into the optimization algorithm. It is also illustrated that the space-filling characteristics of OLHS designs based on midpoint LHS are significantly better than those based on random LHS. The two approaches are compared by incorporating their resulting sample designs in Monte Carlo simulation (MCS) for uncertainty propagation analysis, and then by employing the sample designs in the selection of the training set for constructing non-intrusive polynomial chaos expansion (NIPCE) meta-models which subsequently replace the original full model in MCSs. The analysis is based on two case studies involving numerical simulation of density-dependent flow and solute transport in porous media within the context of seawater intrusion in coastal aquifers. We show that the use of midpoint LHS as the initial design increases the efficiency and robustness of the resulting MCSs and NIPCE meta-models. The study also illustrates that this relative improvement decreases with increasing number of sample points and input parameter dimensions. Since the computational time and efforts for generating the sample designs in the two approaches are identical, the use of midpoint LHS as the initial design in OLHS is thus recommended.
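
    The two initial-design choices compared in the study differ only in where each point sits inside its hypercube interval. A minimal sketch of generating both designs (the subsequent space-filling optimization step is omitted) and comparing a crude space-filling measure follows; the sample size, dimension, and distance criterion are illustrative.

```python
import numpy as np

def lhs(n, d, midpoint=True, seed=None):
    """Latin hypercube sample of n points in d dimensions.

    midpoint=True places points at interval centres (midpoint LHS);
    midpoint=False draws them uniformly within each interval (random LHS).
    """
    rng = np.random.default_rng(seed)
    offsets = 0.5 * np.ones((n, d)) if midpoint else rng.random((n, d))
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)          # one stratum per point in every dimension
        samples[:, j] = (perm + offsets[:, j]) / n
    return samples

def min_dist(x):
    """Minimum pairwise distance: a crude space-filling indicator."""
    diff = x[:, None, :] - x[None, :, :]
    d = np.sqrt((diff ** 2).sum(-1))
    return d[np.triu_indices(len(x), 1)].min()

print("midpoint LHS min distance:", min_dist(lhs(30, 2, midpoint=True, seed=42)))
print("random   LHS min distance:", min_dist(lhs(30, 2, midpoint=False, seed=42)))
```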

  5. Innovative recruitment using online networks: lessons learned from an online study of alcohol and other drug use utilizing a web-based, respondent-driven sampling (webRDS) strategy.

    PubMed

    Bauermeister, José A; Zimmerman, Marc A; Johns, Michelle M; Glowacki, Pietreck; Stoddard, Sarah; Volz, Erik

    2012-09-01

    We used a web version of Respondent-Driven Sampling (webRDS) to recruit a sample of young adults (ages 18-24) and examined whether this strategy would result in alcohol and other drug (AOD) prevalence estimates comparable to national estimates (National Survey on Drug Use and Health [NSDUH]). We recruited 22 initial participants (seeds) via Facebook to complete a web survey examining AOD risk correlates. Sequential, incentivized recruitment continued until our desired sample size was achieved. After correcting for webRDS clustering effects, we contrasted our AOD prevalence estimates (past 30 days) to NSDUH estimates by comparing the 95% confidence intervals of prevalence estimates. We found comparable AOD prevalence estimates between our sample and NSDUH for the past 30 days for alcohol, marijuana, cocaine, Ecstasy (3,4-methylenedioxymethamphetamine, or MDMA), and hallucinogens. Cigarette use was lower than NSDUH estimates. WebRDS may be a suitable strategy to recruit young adults online. We discuss the unique strengths and challenges that may be encountered by public health researchers using webRDS methods.

  6. Sample preparation composite and replicate strategy case studies for assay of solid oral drug products.

    PubMed

    Nickerson, Beverly; Harrington, Brent; Li, Fasheng; Guo, Michele Xuemei

    2017-11-30

    Drug product assay is one of several tests required for new drug products to ensure the quality of the product at release and throughout the life cycle of the product. Drug product assay testing is typically performed by preparing a composite sample of multiple dosage units to obtain an assay value representative of the batch. In some cases replicate composite samples may be prepared, and the reportable assay value is the average value of all the replicates. In previously published work by Harrington et al. (2014) [5], a sample preparation composite and replicate strategy for assay was developed to provide a systematic approach that accounts for variability due to the analytical method and the dosage form, with criteria for the standard error of the potency assay based on compendial and regulatory requirements. In this work, this sample preparation composite and replicate strategy for assay is applied to several case studies to demonstrate the utility of this approach and its application at various stages of pharmaceutical drug product development. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. A DNA logic gate based on strand displacement reaction and rolling circle amplification, responding to multiple low-abundance DNA fragment input signals, and its application in detecting miRNAs.

    PubMed

    Chen, Yuqi; Song, Yanyan; Wu, Fan; Liu, Wenting; Fu, Boshi; Feng, Bingkun; Zhou, Xiang

    2015-04-25

    A conveniently amplified DNA AND logic gate platform was designed for the highly sensitive detection of low-abundance DNA fragment inputs based on strand displacement reaction and rolling circle amplification strategy. Compared with others, this system can detect miRNAs in biological samples. The success of this strategy demonstrates the potential of DNA logic gates in disease diagnosis.

  8. TU-AB-BRC-11: Moving a GPU-OpenCL-Based Monte Carlo (MC) Dose Engine Towards Routine Clinical Use: Automatic Beam Commissioning and Efficient Source Sampling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Z; Folkerts, M; Jiang, S

    Purpose: We have previously developed a GPU-OpenCL-based MC dose engine named goMC with a built-in analytical linac beam model. To move goMC towards routine clinical use, we have developed an automatic beam-commissioning method and an efficient source sampling strategy to facilitate dose calculations for real treatment plans. Methods: Our commissioning method automatically adjusts the relative weights among the sub-sources through an optimization process minimizing the discrepancies between calculated dose and measurements. Six models built for Varian Truebeam linac photon beams (6MV, 10MV, 15MV, 18MV, 6MVFFF, 10MVFFF) were commissioned using measurement data acquired at our institution. To facilitate dose calculations for real treatment plans, we employed an inverse sampling method to efficiently incorporate MLC leaf-sequencing into source sampling. Specifically, instead of sampling source particles control-point by control-point and rejecting the particles blocked by MLC, we assigned a control-point index to each sampled source particle, according to the MLC leaf-open duration of each control-point at the pixel where the particle intersects the iso-center plane. Results: Our auto-commissioning method decreased the distance-to-agreement (DTA) of depth dose at build-up regions by 36.2% on average, making it within 1mm. Lateral profiles were better matched for all beams, with the biggest improvement found at 15MV, for which the root-mean-square difference was reduced from 1.44% to 0.50%. Maximum differences of output factors were reduced to less than 0.7% for all beams, with the largest decrease being from 1.70% to 0.37%, found at 10FFF. Our new sampling strategy was tested on a Head&Neck VMAT patient case. Achieving clinically acceptable accuracy, the new strategy could reduce the required history number by a factor of ∼2.8 at a given statistical uncertainty level and hence achieve a similar speed-up factor. Conclusion: Our studies have demonstrated the feasibility and effectiveness of our auto-commissioning approach and new efficient source sampling strategy, implying the potential of our GPU-based MC dose engine goMC for routine clinical use.
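
    The inverse-sampling idea for control-point selection amounts to drawing each particle's control-point index from the cumulative distribution of leaf-open durations at the point where the particle crosses the isocenter plane, rather than sampling per control point and rejecting blocked particles. A minimal sketch under assumed (hypothetical) leaf-open durations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical leaf-open durations (arbitrary units) of each VMAT control point
# at the pixel where a given source particle crosses the isocenter plane.
open_time = np.array([0.0, 0.3, 1.2, 0.8, 0.0, 0.5])

def sample_control_points(open_time, n_particles):
    """Assign a control-point index to each particle by inverting the cumulative
    distribution of leaf-open durations (inverse-CDF lookup)."""
    cdf = np.cumsum(open_time) / open_time.sum()
    u = rng.random(n_particles)
    return np.searchsorted(cdf, u)

idx = sample_control_points(open_time, 100000)
# Empirical frequencies approximate open_time / open_time.sum()
print(np.bincount(idx, minlength=len(open_time)) / 100000)
```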

  9. The Human Microbiome Project strategy for comprehensive sampling of the human microbiome and why it matters

    PubMed Central

    Aagaard, Kjersti; Petrosino, Joseph; Keitel, Wendy; Watson, Mark; Katancik, James; Garcia, Nathalia; Patel, Shital; Cutting, Mary; Madden, Tessa; Hamilton, Holli; Harris, Emily; Gevers, Dirk; Simone, Gina; McInnes, Pamela; Versalovic, James

    2013-01-01

    The Human Microbiome Project used rigorous good clinical practice standards to complete comprehensive body site sampling in healthy 18- to 40-yr-old adults, creating an unparalleled reference set of microbiome specimens. To ensure that specimens represented minimally perturbed microbiomes, we first screened potential participants using exclusion criteria based on health history, including the presence of systemic diseases (e.g., hypertension, cancer, or immunodeficiency or autoimmune disorders), use of potential immunomodulators, and recent use of antibiotics or probiotics. Subsequent physical examinations excluded individuals based on body mass index (BMI), cutaneous lesions, and oral health. We screened 554 individuals to enroll 300 (149 men and 151 women, mean age 26 yr, mean BMI 24 kg/m2, 20.0% racial minority, and 10.7% Hispanic). We obtained specimens from the oral cavity, nares, skin, gastrointestinal tract, and vagina (15 specimens from men and 18 from women). The study evaluated longitudinal changes in an individual's microbiome by sampling 279 participants twice (mean 212 d after the first sampling; range 30-359 d) and 100 individuals 3 times (mean 72 d after the second sampling; range 30-224 d). This sampling strategy yielded 11,174 primary specimens, from which 12,479 DNA samples were submitted to 4 centers for metagenomic sequencing. Our clinical design and well-defined reference cohort has laid a foundation for microbiome research.—Aagaard, K., Petrosino, J., Keitel, W., Watson, M., Katancik, J., Garcia, N., Patel, S., Cutting, M., Madden, T., Hamilton, H., Harris, E., Gevers, D., Simone, G., McInnes, P., Versalovic, J. The Human Microbiome Project strategy for comprehensive sampling of the human microbiome and why it matters. PMID:23165986

  10. Theory of sampling: four critical success factors before analysis.

    PubMed

    Wagner, Claas; Esbensen, Kim H

    2015-01-01

    Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.

  11. Strategy to obtain axenic cultures from field-collected samples of the cyanobacterium Phormidium animalis.

    PubMed

    Vázquez-Martínez, Guadalupe; Rodriguez, Mario H; Hernández-Hernández, Fidel; Ibarra, Jorge E

    2004-04-01

    An efficient strategy, based on a combination of procedures, was developed to obtain axenic cultures from field-collected samples of the cyanobacterium Phormidium animalis. Samples were initially cultured in solid ASN-10 medium, and a crude separation of major contaminants from P. animalis filaments was achieved by washing in a series of centrifugations and resuspensions in liquid medium. Then, manageable filament fragments were obtained by probe sonication. Fragmentation was followed by forceful washing, using vacuum-driven filtration through an 8-microm pore size membrane and an excess of water. Washed fragments were cultured and treated with a sequential exposure to four different antibiotics. Finally, axenic cultures were obtained from serial dilutions of treated fragments. Monitoring under microscope examination and by inoculation in Luria-Bertani (LB) agar plates indicated either axenicity or the degree of contamination throughout the strategy.

  12. "Shoot and Sense" Janus Micromotors-Based Strategy for the Simultaneous Degradation and Detection of Persistent Organic Pollutants in Food and Biological Samples.

    PubMed

    Rojas, D; Jurado-Sánchez, B; Escarpa, A

    2016-04-05

    A novel Janus micromotor-based strategy for the direct determination of diphenyl phthalate (DPP) in food and biological samples is presented. Mg/Au Janus micromotors are employed as novel analytical platforms for the degradation of the non-electroactive DPP into phenol, which is directly measured by differential pulse voltammetry on disposable screen-printed electrodes. The self-movement of the micromotors along the samples results in the generation of hydrogen microbubbles and hydroxyl ions for DPP degradation. The increased fluid transport dramatically improves the analytical signal, increasing the sensitivity while lowering the detection potential. The method has been successfully applied to the direct analysis of DPP in selected food and biological samples, without any sample treatment and avoiding any potential contamination from laboratory equipment. The developed approach is fast (∼5 min) and accurate, with recoveries of ∼100%. In addition, efficient propulsion of multiple Mg/Au micromotors in complex samples has also been demonstrated. The advantages of the micromotor-assisted technology, i.e., disposability, portability, and the possibility to carry out multiple analyses simultaneously, hold considerable promise for its application in food and biological control in analytical applications with high significance.

  13. Sample size determination for bibliographic retrieval studies

    PubMed Central

    Yao, Xiaomei; Wilczynski, Nancy L; Walter, Stephen D; Haynes, R Brian

    2008-01-01

    Background Research for developing search strategies to retrieve high-quality clinical journal articles from MEDLINE is expensive and time-consuming. The objective of this study was to determine the minimal number of high-quality articles in a journal subset that would need to be hand-searched to update or create new MEDLINE search strategies for treatment, diagnosis, and prognosis studies. Methods The desired width of the 95% confidence intervals (W) for the lowest sensitivity among existing search strategies was used to calculate the number of high-quality articles needed to reliably update search strategies. New search strategies were derived in journal subsets formed by 2 approaches: random sampling of journals and top journals (having the most high-quality articles). The new strategies were tested in both the original large journal database and in a low-yielding journal (having few high-quality articles) subset. Results For treatment studies, if W was 10% or less for the lowest sensitivity among our existing search strategies, a subset of 15 randomly selected journals or 2 top journals were adequate for updating search strategies, based on each approach having at least 99 high-quality articles. The new strategies derived in 15 randomly selected journals or 2 top journals performed well in the original large journal database. Nevertheless, the new search strategies developed using the random sampling approach performed better than those developed using the top journal approach in a low-yielding journal subset. For studies of diagnosis and prognosis, no journal subset had enough high-quality articles to achieve the expected W (10%). Conclusion The approach of randomly sampling a small subset of journals that includes sufficient high-quality articles is an efficient way to update or create search strategies for high-quality articles on therapy in MEDLINE. The concentrations of diagnosis and prognosis articles are too low for this approach. PMID:18823538
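
    The abstract does not restate the formula used, but the standard normal-approximation calculation for the number of articles needed to achieve a target confidence-interval width around a sensitivity estimate is straightforward; the sensitivity and width below are example values only.

```python
from math import ceil

def n_for_ci_width(p, width, z=1.96):
    """Articles needed so the full width of the normal-approximation 95% CI
    around an estimated sensitivity p is at most `width`."""
    return ceil(4 * z**2 * p * (1 - p) / width**2)

# Example values only: lowest sensitivity 0.90, desired full CI width of 10%
print(n_for_ci_width(0.90, 0.10))   # -> 139 for these illustrative inputs
```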

  14. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun

    2017-12-01

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency of the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search for informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
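
    As a rough illustration of the hybrid scoring idea — combining an exploration (distance) term with an exploitation term based on the discrepancy between a surrogate prediction and a first-order Taylor expansion around the nearest existing sample — a 1-D sketch is given below. The toy model, weighting, gradient estimate, and linear-interpolation surrogate are illustrative simplifications, not the authors' exact score function or stopping criterion.

```python
import numpy as np

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.5 * x**2          # stand-in for an expensive model (1-D)

# Existing training set and a pool of candidate locations
X = np.sort(rng.uniform(-2, 2, 6))
Y = f(X)
cand = np.linspace(-2, 2, 401)

def pick_next(X, Y, cand, w=0.5):
    """Pick the candidate with the highest hybrid score: normalized distance to the
    nearest sample (exploration) plus normalized discrepancy between a crude surrogate
    and its first-order Taylor prediction (exploitation)."""
    grads = np.gradient(Y, X)                                  # rough gradients at samples
    j = np.argmin(np.abs(cand[:, None] - X[None, :]), axis=1)  # nearest-sample index
    dist = np.abs(cand - X[j])
    surrogate = np.interp(cand, X, Y)                          # piecewise-linear surrogate
    taylor = np.abs(surrogate - (Y[j] + grads[j] * (cand - X[j])))
    score = w * dist / dist.max() + (1 - w) * taylor / (taylor.max() + 1e-12)
    return cand[np.argmax(score)]

print("next sample location:", pick_next(X, Y, cand))
```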

  15. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    PubMed

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random sampling start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative to accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
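
    The grid-placement part of systematic random sampling — a random start within one grid interval, then equally spaced steps across the region of interest — is simple to generate in software. A minimal sketch follows; the extents, step sizes, and units are illustrative and this is not the authors' program.

```python
import random

def systematic_random_sites(extent_x, extent_y, step_x, step_y, seed=None):
    """Sampling-site coordinates on a regular grid with a random start point,
    as used in systematic random sampling for stereology."""
    rng = random.Random(seed)
    x0 = rng.uniform(0, step_x)     # random offset within one grid interval
    y0 = rng.uniform(0, step_y)
    sites = []
    y = y0
    while y < extent_y:
        x = x0
        while x < extent_x:
            sites.append((x, y))
            x += step_x
        y += step_y
    return sites

# e.g. a 2 mm x 1.5 mm section sampled every 300 um in x and 250 um in y
print(len(systematic_random_sites(2000.0, 1500.0, 300.0, 250.0, seed=1)))
```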

  16. Least squares polynomial chaos expansion: A review of sampling strategies

    NASA Astrophysics Data System (ADS)

    Hadigol, Mohammad; Doostan, Alireza

    2018-04-01

    As non-intrusive polynomial chaos expansion (PCE) techniques have gained growing popularity among researchers, we here provide a comprehensive review of major sampling strategies for the least squares based PCE. Traditional sampling methods, such as Monte Carlo, Latin hypercube, quasi-Monte Carlo, optimal design of experiments (ODE), and Gaussian quadratures, as well as more recent techniques, such as coherence-optimal and randomized quadratures, are discussed. We also propose a hybrid sampling method, dubbed alphabetic-coherence-optimal, that employs the so-called alphabetic optimality criteria used in the context of ODE in conjunction with coherence-optimal samples. A comparison between the empirical performance of the selected sampling methods applied to three numerical examples, including high-order PCEs, high-dimensional problems, and low oversampling ratios, is presented to provide a road map for practitioners seeking the most suitable sampling technique for a problem at hand. We observed that the alphabetic-coherence-optimal technique outperforms other sampling methods, especially when high-order ODEs are employed and/or the oversampling ratio is low.
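
    The least squares PCE construction itself is compact: evaluate the model at the sample points, build a polynomial design matrix, and solve a least squares problem. A minimal 1-D Legendre sketch using plain Monte Carlo samples is shown below; the toy model, order, and oversampling ratio are illustrative, and coherence-optimal or ODE-based sampling would replace the sample-generation line.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

def model(x):                      # toy 1-D model on [-1, 1]
    return np.exp(0.5 * x) * np.sin(3 * x)

def ls_pce(samples, order):
    """Fit a 1-D Legendre polynomial chaos expansion by ordinary least squares."""
    y = model(samples)
    A = legendre.legvander(samples, order)     # design matrix of Legendre polynomials
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Monte Carlo sampling with a mild oversampling ratio (n samples vs. order+1 basis terms)
order = 8
n = 2 * (order + 1)
coef = ls_pce(rng.uniform(-1, 1, n), order)

x_test = np.linspace(-1, 1, 200)
print("max approximation error:", np.max(np.abs(legendre.legval(x_test, coef) - model(x_test))))
```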

  17. Rapid Characterization of Constituents in Tribulus terrestris from Different Habitats by UHPLC/Q-TOF MS.

    PubMed

    Zheng, Wei; Wang, Fangxu; Zhao, Yang; Sun, Xinguang; Kang, Liping; Fan, Ziquan; Qiao, Lirui; Yan, Renyi; Liu, Shuchen; Ma, Baiping

    2017-11-01

    A strategy for rapid identification of the chemical constituents from crude extracts of Tribulus terrestris was proposed using an informatics platform for the UHPLC/Q-TOF MSE data analyses. This strategy mainly utilizes neutral losses, characteristic fragments, and an in-house library to rapidly identify the structure of the compounds. With this strategy, rapid characterization of the chemical components of T. terrestris from Beijing, China was successfully achieved. A total of 82 steroidal saponins and nine flavonoids were identified or tentatively identified from T. terrestris. Among them, 15 new components were deduced based on retention times and characteristic MS fragmentation patterns. Furthermore, the chemical components of T. terrestris, including the other two samples from the Xinjiang Uygur Autonomous Region, China, and Rome, Italy, were also identified with this strategy. Altogether, 141 chemical components were identified from these three samples, of which 39 components were identified or tentatively identified as new compounds, including 35 groups of isomers. It demonstrated that this strategy provided an efficient protocol for the rapid identification of chemical constituents in complex samples such as traditional Chinese medicines (TCMs) by UHPLC/Q-TOF MSE with an informatics platform.

  18. Rapid Characterization of Constituents in Tribulus terrestris from Different Habitats by UHPLC/Q-TOF MS

    NASA Astrophysics Data System (ADS)

    Zheng, Wei; Wang, Fangxu; Zhao, Yang; Sun, Xinguang; Kang, Liping; Fan, Ziquan; Qiao, Lirui; Yan, Renyi; Liu, Shuchen; Ma, Baiping

    2017-08-01

    A strategy for rapid identification of the chemical constituents from crude extracts of Tribulus terrestris was proposed using an informatics platform for the UHPLC/Q-TOF MSE data analyses. This strategy mainly utilizes neutral losses, characteristic fragments, and an in-house library to rapidly identify the structure of the compounds. With this strategy, rapid characterization of the chemical components of T. terrestris from Beijing, China was successfully achieved. A total of 82 steroidal saponins and nine flavonoids were identified or tentatively identified from T. terrestris. Among them, 15 new components were deduced based on retention times and characteristic MS fragmentation patterns. Furthermore, the chemical components of T. terrestris, including the other two samples from the Xinjiang Uygur Autonomous Region, China, and Rome, Italy, were also identified with this strategy. Altogether, 141 chemical components were identified from these three samples, of which 39 components were identified or tentatively identified as new compounds, including 35 groups of isomers. It demonstrated that this strategy provided an efficient protocol for the rapid identification of chemical constituents in complex samples such as traditional Chinese medicines (TCMs) by UHPLC/Q-TOF MSE with an informatics platform.

  19. Establishing the ACORN National Practitioner Database: Strategies to Recruit Practitioners to a National Practice-Based Research Network.

    PubMed

    Adams, Jon; Steel, Amie; Moore, Craig; Amorin-Woods, Lyndon; Sibbritt, David

    2016-10-01

    The purpose of this paper is to report on the recruitment and promotion strategies employed by the Australian Chiropractic Research Network (ACORN) project aimed at helping recruit a substantial national sample of participants and to describe the features of our practice-based research network (PBRN) design that may provide key insights to others looking to establish a similar network or draw on the ACORN project to conduct sub-studies. The ACORN project followed a multifaceted recruitment and promotion strategy drawing on distinct branding, a practitioner-focused promotion campaign, and a strategically designed questionnaire and distribution/recruitment approach to attract sufficient participation from the ranks of registered chiropractors across Australia. From the 4684 chiropractors registered at the time of recruitment, the project achieved a database response rate of 36% (n = 1680), resulting in a large, nationally representative sample across age, gender, and location. This sample constitutes the largest proportional coverage of participants from any voluntary national PBRN across any single health care profession. It does appear that a number of key promotional and recruitment features of the ACORN project may have helped establish the high response rate for the PBRN, which constitutes an important sustainable resource for future national and international efforts to grow the chiropractic evidence base and research capacity. Further rigorous enquiry is needed to help evaluate the direct contribution of specific promotional and recruitment strategies in attaining high response rates from practitioner populations who may be invited to participate in future PBRNs. Copyright © 2016. Published by Elsevier Inc.

  20. Empowering Students to Write and Re-Write: Standards-Based Strategies for Middle and High School Teachers

    ERIC Educational Resources Information Center

    Combs, Warren E.

    2009-01-01

    In this book, the author provides teachers with detailed strategies and lesson plans, along with real student writing samples. He describes effective routines of formative self-assessment, and shows teachers how to form a professional learning team with their colleagues using the 6-session professional learning guide. Contents include: (1)…

  1. Categories and Underlying Processes, or Representative Behavior Samples and S-R Analysis: Opposing Strategies.

    ERIC Educational Resources Information Center

    Staats, Arthur W.

    Psychological researchers should deal with the concrete stimulus-response principles of learning on which behavior is based, and study behaviors that are representative of real life behaviors. The present research strategy has come from two faulty ideas: first, a concern with underlying, inferred mental processes, rather than with actual tasks or…

  2. Youth Audience Segmentation Strategies for Smoking-Prevention Mass Media Campaigns Based on Message Appeal

    ERIC Educational Resources Information Center

    Flynn, Brian S.; Worden, John K.; Bunn, Janice Yanushka; Dorwaldt, Anne L.; Connolly, Scott W.; Ashikaga, Takamaru

    2007-01-01

    Mass media interventions are among the strategies recommended for youth cigarette smoking prevention, but little is known about optimal methods for reaching diverse youth audiences. Grades 4 through 12 samples of youth from four states (n = 1,230) rated smoking-prevention messages in classroom settings. Similar proportions of African American,…

  3. Providers' Reported and Actual Use of Coaching Strategies in Natural Environments

    ERIC Educational Resources Information Center

    Salisbury, Christine; Cambray-Engstrom, Elizabeth; Woods, Juliann

    2012-01-01

    This case study examined the agreement between reported and actual use of coaching strategies based on home visit data collected on a diverse sample of providers and families. Paired videotape and contact note data of and from providers during home visits were collected over a six month period and analyzed using structured protocols. Results of…

  4. Exploring K-3 Teachers' Implementation of Comprehension Strategy Instruction (CSI) Using Expectancy-Value Theory

    ERIC Educational Resources Information Center

    Foley, Laura S.

    2011-01-01

    This research investigated factors that influence the implementation levels of evidence-based comprehension strategy instruction (CSI) among K-3 teachers. An explanatory design was chosen to gather and probe the data. Quantitative data were gathered via a mailed survey distributed through a representative sample of the 40 school districts (through…

  5. Pharmacokinetic Modeling and Limited Sampling Strategies Based on Healthy Volunteers for Monitoring of Ertapenem in Patients with Multidrug-Resistant Tuberculosis.

    PubMed

    van Rijn, S P; Zuur, M A; van Altena, R; Akkerman, O W; Proost, J H; de Lange, W C M; Kerstjens, H A M; Touw, D J; van der Werf, T S; Kosterink, J G W; Alffenaar, J W C

    2017-04-01

    Ertapenem is a broad-spectrum carbapenem antibiotic whose activity against Mycobacterium tuberculosis is being explored. Carbapenems have antibacterial activity when the plasma concentration exceeds the MIC at least 40% of the time (40% TMIC). To assess the 40% TMIC in multidrug-resistant tuberculosis (MDR-TB) patients, a limited sampling strategy was developed using a population pharmacokinetic model based on data for healthy volunteers. A two-compartment population pharmacokinetic model was developed with data for 42 healthy volunteers using an iterative two-stage Bayesian method. External validation was performed by Bayesian fitting of the model developed with data for volunteers to the data for individual MDR-TB patients (in which the fitted values of the area under the concentration-time curve from 0 to 24 h [AUC0–24,fit values] were used) using the population model developed for volunteers as a prior. A Monte Carlo simulation (n = 1,000) was used to evaluate limited sampling strategies. Additionally, the 40% TMIC with the free fraction (f40% TMIC) of ertapenem in MDR-TB patients was estimated with the population pharmacokinetic model. The population pharmacokinetic model that was developed was shown to overestimate the area under the concentration-time curve from 0 to 24 h (AUC0–24) in MDR-TB patients by 6.8% (range, -17.2 to 30.7%). The best-performing limited sampling strategy, which had a time restriction of 0 to 6 h, was found to be sampling at 1 and 5 h (r2 = 0.78, mean prediction error = -0.33%, root mean square error = 5.5%). Drug exposure was overestimated by a mean percentage of 4.2% (range, -15.2 to 23.6%). When a free fraction of 5% was considered and the MIC was set at 0.5 mg/liter, the minimum f40% TMIC would have been exceeded in 9 out of 12 patients. A population pharmacokinetic model and limited sampling strategy, developed using data from healthy volunteers, were shown to be adequate to predict ertapenem exposure in MDR-TB patients. Copyright © 2017 American Society for Microbiology.
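
    The agreement metrics quoted for the best limited sampling scheme (r2, mean prediction error, root mean square error) are computed by comparing reference AUC0–24 values with AUCs predicted from the limited samples. A minimal sketch with hypothetical numbers is shown below; in the study the predictions themselves come from Bayesian fitting of the two-compartment population model, which is not reproduced here.

```python
import numpy as np

# Hypothetical full-profile reference AUC0-24 values and the corresponding
# AUCs predicted from a limited sampling scheme (e.g., samples at 1 h and 5 h)
auc_ref = np.array([520.0, 610.0, 455.0, 700.0, 580.0])   # mg*h/L, illustrative only
auc_lss = np.array([505.0, 628.0, 470.0, 690.0, 601.0])

def lss_performance(ref, pred):
    """Agreement metrics commonly reported for limited sampling strategies."""
    pe = (pred - ref) / ref * 100.0            # percentage prediction errors
    mpe = pe.mean()                            # mean prediction error (bias)
    rmse = np.sqrt((pe ** 2).mean())           # root mean square error (precision)
    r2 = np.corrcoef(ref, pred)[0, 1] ** 2
    return mpe, rmse, r2

print(lss_performance(auc_ref, auc_lss))
```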

  6. Pharmacokinetic Modeling and Limited Sampling Strategies Based on Healthy Volunteers for Monitoring of Ertapenem in Patients with Multidrug-Resistant Tuberculosis

    PubMed Central

    van Rijn, S. P.; Zuur, M. A.; van Altena, R.; Akkerman, O. W.; Proost, J. H.; de Lange, W. C. M.; Kerstjens, H. A. M.; Touw, D. J.; van der Werf, T. S.; Kosterink, J. G. W.

    2017-01-01

    ABSTRACT Ertapenem is a broad-spectrum carbapenem antibiotic whose activity against Mycobacterium tuberculosis is being explored. Carbapenems have antibacterial activity when the plasma concentration exceeds the MIC at least 40% of the time (40% TMIC). To assess the 40% TMIC in multidrug-resistant tuberculosis (MDR-TB) patients, a limited sampling strategy was developed using a population pharmacokinetic model based on data for healthy volunteers. A two-compartment population pharmacokinetic model was developed with data for 42 healthy volunteers using an iterative two-stage Bayesian method. External validation was performed by Bayesian fitting of the model developed with data for volunteers to the data for individual MDR-TB patients (in which the fitted values of the area under the concentration-time curve from 0 to 24 h [AUC0–24, fit values] were used) using the population model developed for volunteers as a prior. A Monte Carlo simulation (n = 1,000) was used to evaluate limited sampling strategies. Additionally, the 40% TMIC with the free fraction (f 40% TMIC) of ertapenem in MDR-TB patients was estimated with the population pharmacokinetic model. The population pharmacokinetic model that was developed was shown to overestimate the area under the concentration-time curve from 0 to 24 h (AUC0–24) in MDR-TB patients by 6.8% (range, −17.2 to 30.7%). The best-performing limited sampling strategy, which had a time restriction of 0 to 6 h, was found to be sampling at 1 and 5 h (r2 = 0.78, mean prediction error = −0.33%, root mean square error = 5.5%). Drug exposure was overestimated by a mean percentage of 4.2% (range, −15.2 to 23.6%). When a free fraction of 5% was considered and the MIC was set at 0.5 mg/liter, the minimum f 40% TMIC would have been exceeded in 9 out of 12 patients. A population pharmacokinetic model and limited sampling strategy, developed using data from healthy volunteers, were shown to be adequate to predict ertapenem exposure in MDR-TB patients. PMID:28137814

  7. Community-Based Study Recruitment of American Indian Cigarette Smokers and Electronic Cigarette Users.

    PubMed

    Carroll, Dana Mowls; Brame, Lacy S; Stephens, Lancer D; Wagener, Theodore L; Campbell, Janis E; Beebe, Laura A

    2018-02-01

    Data on the effectiveness of strategies for the recruitment of American Indians (AIs) into research are needed. This study describes and compares methods for identifying and recruiting AI tobacco users into a pilot study. Community-based strategies were used to recruit smokers (n = 35), e-cigarette users (n = 28), and dual users (n = 32) of AI descent. Recruitment was considered proactive if study staff contacted the individual at a pow wow, health fair, or vape shop and participation occurred on-site, or reactive if the individual contacted the study staff and participation occurred later. Numbers screened, eligible, and participating, as well as costs and time spent, were compared with chi-square tests. To understand AI descent, the relationship between the number of AI grandparents and AI blood quantum was examined. The number of participants screened via the proactive strategy was similar to that for the reactive strategy (n = 84 vs. n = 82; p-value = 0.8766). A significantly greater proportion of individuals screened via the proactive than the reactive strategy were eligible (77 vs. 50%; p-value = 0.0002) and participated (75 vs. 39%; p-value < 0.0001). Per-participant cost and time estimated for the proactive strategy were $89 and 87 min, compared to $79 and 56 min for the reactive strategy. The proportion with at least half AI blood quantum was 32, 33, and 70% among those with 2, 3, and 4 AI grandparents, respectively (p = 0.0017). Proactive strategies resulted in two-thirds of the sample but required more resources than reactive strategies. Overall, we found both strategies were feasible and resulted in the ability to reach sample goals. Lastly, the number of AI biological grandparents may be a good, non-invasive indicator of AI blood quantum.

  8. Manage Your Cash

    ERIC Educational Resources Information Center

    Matthews, Kenneth M.

    1976-01-01

    Discusses formulas for planning school district investment and borrowing strategies based on a district's predicted cash flow and presents a sample investment/borrowing schedule developed from hypothetical cash-flow data. (JG)

  9. Invited commentary: recruiting for epidemiologic studies using social media.

    PubMed

    Allsworth, Jenifer E

    2015-05-15

    Social media-based recruitment for epidemiologic studies has the potential to expand the demographic and geographic reach of investigators and identify potential participants more cost-effectively than traditional approaches. In fact, social media are particularly appealing for their ability to engage traditionally "hard-to-reach" populations, including young adults and low-income populations. Despite their great promise as a tool for epidemiologists, social media-based recruitment approaches do not currently compare favorably with gold-standard probability-based sampling approaches. Sparse data on the demographic characteristics of social media users, patterns of social media use, and appropriate sampling frames limit our ability to implement probability-based sampling strategies. In a well-conducted study, Harris et al. (Am J Epidemiol. 2015;181(10):737-746) examined the cost-effectiveness of social media-based recruitment (advertisements and promotion) in the Contraceptive Use, Pregnancy Intention, and Decisions (CUPID) Study, a cohort study of 3,799 young adult Australian women, and the approximate representativeness of the CUPID cohort. Implications for social media-based recruitment strategies for cohort assembly, data accuracy, implementation, and human subjects concerns are discussed. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  10. A novel strategy for rapid quantification of 20(S)-protopanaxatriol and 20(S)-protopanaxadiol saponins in Panax notoginseng, P. ginseng and P. quinquefolium.

    PubMed

    Xu, Fa-Xiang; Yuan, Cen; Wan, Jian-Bo; Yan, Ru; Hu, Hao; Li, Shao-Ping; Zhang, Qing-Wen

    2015-01-01

    A novel strategy for the qualitative and quantitative determination of 20(S)-protopanaxatriol saponins (PTS) and 20(S)-protopanaxadiol saponins (PDS) in Panax notoginseng, Panax ginseng and Panax quinquefolium, based on the overlapping peaks of main components of PTS (calibrated by ginsenoside Rg1) and PDS (calibrated by ginsenoside Rb1), was proposed. The analysis was performed by using high-performance liquid chromatography coupled with evaporative light scattering detection (HPLC-ELSD). Under specific chromatographic conditions, all samples showed two overlapping peaks containing several main ginsenosides belonging to PTS and PDS, respectively. The overlapping peaks were also identified by using HPLC-MS. Based on the sum and ratio of PTS and PDS, 60 tested Panax samples were divided into three main clusters according to their species. The findings suggested that this strategy provides a simple and rapid approach to quantify PTS and PDS in Panax herbs.

  11. Lithography hotspot discovery at 70nm DRAM 300mm fab: process window qualification using design base binning

    NASA Astrophysics Data System (ADS)

    Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh

    2008-11-01

    Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) the PWQ layout, to obtain the best sensitivity; (b) Design Based Binning, for pattern repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining location repeaters and pattern repeaters. Based on a recent case study, the new sampling flow reduces the data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.

  12. Comparison of tablet-based strategies for incision planning in laser microsurgery

    NASA Astrophysics Data System (ADS)

    Schoob, Andreas; Lekon, Stefan; Kundrat, Dennis; Kahrs, Lüder A.; Mattos, Leonardo S.; Ortmaier, Tobias

    2015-03-01

    Recent research has revealed that incision planning in laser surgery using a stylus and tablet outperforms state-of-the-art micro-manipulator-based laser control. To provide more detailed quantitation of that approach, a comparative study of six tablet-based strategies for laser path planning is presented. The reference strategy is defined by monoscopic visualization and continuous path drawing on a graphics tablet. Further concepts deploying stereoscopic visualization or a synthesized laser view, point-based path definition, real-time teleoperation, or a pen display are compared with the reference scenario. Volunteers were asked to redraw and ablate stamped lines on a sample. Performance is assessed by measuring planning accuracy, completion time, and ease of use. Results demonstrate that significant differences exist between the proposed concepts. The reference strategy provides more accurate incision planning than the stereo or laser view scenario. Real-time teleoperation performs best with respect to completion time without indicating any significant deviation in accuracy and usability. Point-based planning as well as the pen display provide the most accurate planning and increased ease of use compared to the reference strategy. As a result, combining the pen display approach with point-based planning has the potential to become a powerful strategy, because it benefits from improved hand-eye coordination on the one hand and from a simple but accurate technique for path definition on the other. These findings, as well as the overall usability scale indicating high acceptance and consistency of the proposed strategies, motivate further advanced tablet-based planning in laser microsurgery.

  13. Enhancing Self-Regulation and Approaches to Learning in First-Year College Students: A Narrative-Based Programme Assessed in the Iberian Peninsula

    ERIC Educational Resources Information Center

    Rosario, Pedro; Nunez, Jose C.; Gonzalez-Pienda, Julio; Valle, Antonio; Trigo, Luisa; Guimaraes, Carina

    2010-01-01

    A programme to enhance first-year college students' self-regulated learning strategies is proposed and assessed in two samples, one from the University of Oviedo (Spain) and the other from the University of Minho (Portugal). Each sample was divided into two groups (experimental and control). The Spanish sample comprised 44 students in the…

  14. Probe sampling strategies for traffic monitoring systems based on wireless location technology.

    DOT National Transportation Integrated Search

    2007-01-01

    Transportation agencies have become very interested in traffic monitoring systems based on wireless location technology (WLT) since they offer the potential of collecting travel time data across a wide portion of the road system. Prior tests of WLT-b...

  15. Effects of organic carbon sequestration strategies on soil enzymatic activities

    NASA Astrophysics Data System (ADS)

    Puglisi, E.; Suciu, N.; Botteri, L.; Ferrari, T.; Coppolecchia, D.; Trevisan, M.; Piccolo, A.

    2009-04-01

    Greenhouse gas emissions can be counterbalanced with proper agronomic strategies aimed at sequestering carbon in soils. These strategies must be tested not only for their ability to reduce carbon dioxide emissions, but also for their impact on soil quality: enzymatic activities are related to overall soil ecological quality and can be used as early and sensitive indicators of alteration events. Three different strategies for soil carbon sequestration were studied: minimum tillage, protection of the biodegradable organic fraction by compost amendment, and oxidative polymerization of soil organic matter catalyzed by biomimetic porphyrins. All strategies were compared with a traditional agricultural management based on tillage and mineral fertilization. Experiments were carried out in three Italian soils from different pedo-climatic regions, located respectively in Piacenza, Turin and Naples, and cultivated with maize or wheat. Soil samples were taken for three consecutive years after harvest and analyzed for phosphatase, β-glucosidase, urease and invertase activities. An alteration index based on the levels of these enzymatic activities was applied as well. The biomimetic porphyrin application did not cause changes in enzymatic activities compared to the control at any treatment or location. Enzymatic activities were generally higher in the minimum tillage and compost treatments, while differences between locations and sampling dates were limited. Application of the soil alteration index based on enzymatic activities showed that soils treated with compost or subjected to minimum tillage generally had a higher biological quality. The work confirms the environmental sustainability of the carbon-sequestering agronomic practices studied.

  16. Integrated strategy for identifying minor components in complex samples combining mass defect, diagnostic ions and neutral loss information based on ultra-performance liquid chromatography-high resolution mass spectrometry platform: Folium Artemisiae Argyi as a case study.

    PubMed

    Ren, Dabing; Ran, Lu; Yang, Chong; Xu, Meilin; Yi, Lunzhao

    2018-05-18

    Ultra-performance liquid chromatography coupled to high-resolution mass spectrometry (UPLC-HRMS) has been used as a powerful tool to profile chemicals in traditional Chinese medicines. However, identification of potentially bioactive compounds remains challenging because of the large amount of information contained in the raw UPLC-HRMS data. In particular, ubiquitous matrix interference makes it more difficult to characterize the minor components. Therefore, rapid recognition and efficient extraction of the corresponding parent ions is critically important for identifying the compounds of interest in complex samples. Herein, we propose an integrated filtering strategy to remove unrelated or interfering MS1 ions from the raw UPLC-HRMS data, which helps to retain the MS features of the target components and expose the compounds of interest as effectively as possible. The proposed strategy is based on a combination of different filtering methods, including nitrogen-rule, mass-defect, and neutral loss/diagnostic fragment ion filtering. The strategy was validated by rapid screening and identification of 16 methoxylated flavonoids and 55 chlorogenic acid analogues from the raw UPLC-HRMS dataset of Folium Artemisiae Argyi. In particular, the successful detection of several minor components indicated that the integrated strategy has obvious advantages over individual filtering methods, and it can be used as a promising method for screening and identifying compounds from complex samples, such as herbal medicines. Copyright © 2018 Elsevier B.V. All rights reserved.
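
    A minimal sketch of the kind of MS1 filtering described above (hypothetical m/z values, windows and neutral-loss masses; the actual filters and thresholds used in the paper are not reproduced here):

```python
# Sketch only: filter MS1 features by a fractional-mass (mass defect) window and
# flag precursors whose MS/MS shows a given neutral loss, mimicking the idea of
# combining several filters to expose minor components.
import numpy as np

def mass_defect_filter(mz, low=0.20, high=0.40):
    """Keep ions whose fractional mass falls inside a defect window."""
    defect = mz - np.floor(mz)
    return (defect >= low) & (defect <= high)

def neutral_loss_hits(precursor_mz, fragment_mz, loss=162.0528, tol=0.005):
    """True if any fragment equals the precursor minus a neutral loss
    (e.g. 162.0528 Da for a hexose unit) within an m/z tolerance."""
    return np.any(np.abs((precursor_mz - loss) - np.asarray(fragment_mz)) <= tol)

mz_values = np.array([354.0951, 515.1190, 609.1461, 180.0634])   # invented MS1 features
kept = mz_values[mass_defect_filter(mz_values, 0.09, 0.16)]
print(kept, neutral_loss_hits(515.1190, [353.0662, 191.0556]))
```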

  17. A Fosmid Cloning Strategy for Detecting the Widest Possible Spectrum of Microbes from the International Space Station Drinking Water System

    PubMed Central

    Choi, Sangdun; Chang, Mi Sook; Stuecker, Tara; Chung, Christine; Newcombe, David A.; Venkateswaran, Kasthuri

    2012-01-01

    In this study, fosmid cloning strategies were used to assess the microbial populations in water from the International Space Station (ISS) drinking water system (henceforth referred to as Prebiocide and Tank A water samples). The goals of this study were: to compare the sensitivity of the fosmid cloning strategy with that of traditional culture-based and 16S rRNA-based approaches and to detect the widest possible spectrum of microbial populations during the water purification process. Initially, microbes could not be cultivated, and conventional PCR failed to amplify 16S rDNA fragments from these low biomass samples. Therefore, randomly primed rolling-circle amplification was used to amplify any DNA that might be present in the samples, followed by size selection by using pulsed-field gel electrophoresis. The amplified high-molecular-weight DNA from both samples was cloned into fosmid vectors. Several hundred clones were randomly selected for sequencing, followed by Blastn/Blastx searches. Sequences encoding specific genes from Burkholderia, a species abundant in the soil and groundwater, were found in both samples. Bradyrhizobium and Mesorhizobium, which belong to rhizobia, a large community of nitrogen fixers often found in association with plant roots, were present in the Prebiocide samples. Ralstonia, which is prevalent in soils with a high heavy metal content, was detected in the Tank A samples. The detection of many unidentified sequences suggests the presence of potentially novel microbial fingerprints. The bacterial diversity detected in this pilot study using a fosmid vector approach was higher than that detected by conventional 16S rRNA gene sequencing. PMID:23346038

  18. Development of a sampling strategy and sample size calculation to estimate the distribution of mammographic breast density in Korean women.

    PubMed

    Jun, Jae Kwan; Kim, Mi Jin; Choi, Kui Son; Suh, Mina; Jung, Kyu-Won

    2012-01-01

    Mammographic breast density is a known risk factor for breast cancer. To conduct a survey to estimate the distribution of mammographic breast density in Korean women, appropriate sampling strategies for representative and efficient sampling design were evaluated through simulation. Using the target population from the National Cancer Screening Programme (NCSP) for breast cancer in 2009, we verified the distribution estimate by repeating the simulation 1,000 times using stratified random sampling to investigate the distribution of breast density of 1,340,362 women. According to the simulation results, using a sampling design stratifying the nation into three groups (metropolitan, urban, and rural), with a total sample size of 4,000, we estimated the distribution of breast density in Korean women at a level of 0.01% tolerance. Based on the results of our study, a nationwide survey for estimating the distribution of mammographic breast density among Korean women can be conducted efficiently.
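
    The simulation logic can be sketched as follows (a toy example with invented stratum sizes and density proportions, not the NCSP data): repeated proportional stratified sampling of 4,000 women, followed by comparison of the estimated category proportions against the population values.

```python
# Sketch of a stratified-sampling simulation for estimating the distribution of
# a categorical variable (e.g. breast density categories) in a large population.
import numpy as np

rng = np.random.default_rng(1)
strata_sizes = {"metropolitan": 700_000, "urban": 450_000, "rural": 190_362}  # hypothetical split
density_probs = [0.08, 0.34, 0.42, 0.16]            # hypothetical density-category proportions
total_n, n_sample = sum(strata_sizes.values()), 4_000

estimates = []
for _ in range(1_000):                              # 1,000 simulation repetitions
    est = np.zeros(4)
    for size in strata_sizes.values():
        n_h = round(n_sample * size / total_n)      # proportional allocation per stratum
        draws = rng.choice(4, size=n_h, p=density_probs)
        est += np.bincount(draws, minlength=4)
    estimates.append(est / est.sum())

print(np.abs(np.mean(estimates, axis=0) - density_probs).max())  # max absolute deviation
```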

  19. The price elasticity of demand for heroin: matched longitudinal and experimental evidence#

    PubMed Central

    Olmstead, Todd A.; Alessi, Sheila M.; Kline, Brendan; Pacula, Rosalie Liccardo; Petry, Nancy M.

    2015-01-01

    This paper reports estimates of the price elasticity of demand for heroin based on a newly constructed dataset. The dataset has two matched components concerning the same sample of regular heroin users: longitudinal information about real-world heroin demand (actual price and actual quantity at daily intervals for each heroin user in the sample) and experimental information about laboratory heroin demand (elicited by presenting the same heroin users with scenarios in a laboratory setting). Two empirical strategies are used to estimate the price elasticity of demand for heroin. The first strategy exploits the idiosyncratic variation in the price experienced by a heroin user over time that occurs in markets for illegal drugs. The second strategy exploits the experimentally-induced variation in price experienced by a heroin user across experimental scenarios. Both empirical strategies result in the estimate that the conditional price elasticity of demand for heroin is approximately −0.80. PMID:25702687
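
    The core estimation idea, a log-log regression whose slope is the price elasticity, can be sketched with simulated data (this is not the study's dataset or econometric specification, which also handles the panel structure and experimental scenarios):

```python
# Sketch: estimate the price elasticity of demand as the slope of a log-log
# regression of quantity on price, using simulated purchase records.
import numpy as np

rng = np.random.default_rng(2)
price = rng.lognormal(mean=3.0, sigma=0.4, size=500)            # hypothetical daily prices
quantity = np.exp(5.0 - 0.8 * np.log(price) + rng.normal(0, 0.3, 500))

slope, intercept = np.polyfit(np.log(price), np.log(quantity), 1)
print(f"estimated elasticity: {slope:.2f}")                      # close to the true -0.8
```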

  20. Pharmacokinetic Studies in Neonates: The Utility of an Opportunistic Sampling Design.

    PubMed

    Leroux, Stéphanie; Turner, Mark A; Guellec, Chantal Barin-Le; Hill, Helen; van den Anker, Johannes N; Kearns, Gregory L; Jacqz-Aigrain, Evelyne; Zhao, Wei

    2015-12-01

    The use of an opportunistic (also called scavenged) sampling strategy in a prospective pharmacokinetic study combined with population pharmacokinetic modelling has been proposed as an alternative strategy to conventional methods for accomplishing pharmacokinetic studies in neonates. However, the reliability of this approach in this particular paediatric population has not been evaluated. The objective of the present study was to evaluate the performance of an opportunistic sampling strategy for a population pharmacokinetic estimation, as well as dose prediction, and compare this strategy with a predetermined pharmacokinetic sampling approach. Three population pharmacokinetic models were derived for ciprofloxacin from opportunistic blood samples (SC model), predetermined (i.e. scheduled) samples (TR model) and all samples (full model used to previously characterize ciprofloxacin pharmacokinetics), using NONMEM software. The predictive performance of developed models was evaluated in an independent group of patients. Pharmacokinetic data from 60 newborns were obtained with a total of 430 samples available for analysis; 265 collected at predetermined times and 165 that were scavenged from those obtained as part of clinical care. All datasets were fit using a two-compartment model with first-order elimination. The SC model could identify the most significant covariates and provided reasonable estimates of population pharmacokinetic parameters (clearance and steady-state volume of distribution) compared with the TR and full models. Their predictive performances were further confirmed in an external validation by Bayesian estimation, and showed similar results. Monte Carlo simulation based on area under the concentration-time curve from zero to 24 h (AUC24)/minimum inhibitory concentration (MIC) using either the SC or the TR model gave similar dose prediction for ciprofloxacin. Blood samples scavenged in the course of caring for neonates can be used to estimate ciprofloxacin pharmacokinetic parameters and therapeutic dose requirements.

  1. Chi-Squared Test of Fit and Sample Size-A Comparison between a Random Sample Approach and a Chi-Square Value Adjustment Method.

    PubMed

    Bergh, Daniel

    2015-01-01

    Chi-square statistics are commonly used for tests of fit of measurement models. Chi-square is also sensitive to sample size, which is why several approaches to handling large samples in tests of fit have been developed. One strategy to handle the sample size problem is to adjust the sample size in the analysis of fit. An alternative is to adopt a random sample approach. The purpose of this study was to analyze and compare these two strategies using simulated data. Given an original sample size of 21,000, for reductions of the sample size down to the order of 5,000 the adjusted sample size function works as well as the random sample approach. In contrast, when applying adjustments to sample sizes of lower order, the adjustment function is less effective at approximating the chi-square value for an actual random sample of the relevant size. Hence, the fit is exaggerated and misfit underestimated when using the adjusted sample size function. Although there are large differences in chi-square values between the two approaches at lower sample sizes, the inferences based on the p-values may be the same.
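
    A small sketch of the comparison (the rescaling shown, chi-square multiplied by the ratio of the target to the actual sample size, is one common form of adjustment and is only assumed here, not taken from the paper):

```python
# Sketch: compare a chi-square value rescaled to a smaller nominal sample size
# with the chi-square obtained from an actual random subsample of that size.
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(3)
a = rng.choice(2, size=21_000, p=[0.5, 0.5])
b = np.where(rng.random(21_000) < 0.6, a, rng.choice(2, size=21_000))
full = np.column_stack([a, b])                     # two associated binary items

def chi2_of(data):
    table = np.histogram2d(data[:, 0], data[:, 1], bins=2)[0]
    return chi2_contingency(table, correction=False)[0]

chi2_full = chi2_of(full)
for n_new in (5_000, 1_000):
    adjusted = chi2_full * n_new / len(full)       # chi-square rescaled to n_new
    subsample = full[rng.choice(len(full), n_new, replace=False)]
    print(n_new, round(adjusted, 1), round(chi2_of(subsample), 1))
```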

  2. High-resolution monitoring of marine protists based on an observation strategy integrating automated on-board filtration and molecular analyses

    NASA Astrophysics Data System (ADS)

    Metfies, Katja; Schroeder, Friedhelm; Hessel, Johanna; Wollschläger, Jochen; Micheller, Sebastian; Wolf, Christian; Kilias, Estelle; Sprong, Pim; Neuhaus, Stefan; Frickenhaus, Stephan; Petersen, Wilhelm

    2016-11-01

    Information on the recent biomass distribution and biogeography of photosynthetic marine protists with adequate temporal and spatial resolution is urgently needed to better understand the consequences of environmental change for marine ecosystems. Here we introduce and review a molecular-based observation strategy for high-resolution assessment of these protists in space and time. It is the result of extensive technology development, adaptation and evaluation documented in a number of publications, together with the results of the recently completed field testing introduced in this paper. The observation strategy is organized at four different levels. At level 1, samples are collected at high spatiotemporal resolution using the remotely controlled automated filtration system AUTOFIM. The resulting samples can either be preserved for later laboratory analyses or directly subjected to molecular surveillance of key species aboard the ship via an automated biosensor system or quantitative polymerase chain reaction (level 2). Preserved samples are analyzed at the next observational levels in the laboratory (levels 3 and 4). At level 3 this involves molecular fingerprinting methods for a quick and reliable overview of differences in protist community composition. Finally, selected samples can be used to generate a detailed analysis of taxonomic protist composition via the latest next-generation sequencing technology (NGS) at level 4. An overall integrated dataset of the results of the different analyses provides comprehensive information on the diversity and biogeography of protists, including all related size classes. At the same time, the cost of the observation is optimized with respect to analysis effort and time.

  3. Bare-Bones Teaching-Learning-Based Optimization

    PubMed Central

    Zou, Feng; Wang, Lei; Hei, Xinhong; Chen, Debao; Jiang, Qiaoyong; Li, Hongye

    2014-01-01

    The teaching-learning-based optimization (TLBO) algorithm, which simulates the teaching-learning process of a classroom, is one of the recently proposed swarm intelligence (SI) algorithms. In this paper, a new TLBO variant called bare-bones teaching-learning-based optimization (BBTLBO) is presented to solve global optimization problems. In this method, each learner in the teacher phase employs an interactive learning strategy, which is a hybridization of the learning strategy of the teacher phase in the standard TLBO and Gaussian sampling learning based on neighborhood search, and each learner in the learner phase employs either the learning strategy of the learner phase in the standard TLBO or the new neighborhood search strategy. To verify the performance of our approach, 20 benchmark functions and two real-world problems are utilized. The conducted experiments show that BBTLBO performs significantly better than, or at least comparably to, TLBO and some existing bare-bones algorithms. The results indicate that the proposed algorithm is competitive with some other optimization algorithms. PMID:25013844
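
    A simplified sketch of the bare-bones teacher phase described above (our reading of the idea, not the authors' implementation; the learner phase and the neighborhood search are omitted):

```python
# Sketch: in the bare-bones teacher phase, each learner is drawn from a Gaussian
# centred between the teacher (best learner) and the population mean, instead of
# using the standard TLBO update.
import numpy as np

def bbtlbo_teacher_phase(pop, fitness, rng):
    teacher = pop[np.argmin(fitness)]                 # best learner acts as teacher
    mean = pop.mean(axis=0)
    centre = (teacher + mean) / 2.0
    spread = np.abs(teacher - mean) + 1e-12           # avoid a zero standard deviation
    return rng.normal(centre, spread, size=pop.shape) # Gaussian sampling learning

def sphere(x):                                        # simple benchmark function
    return np.sum(x**2, axis=1)

rng = np.random.default_rng(4)
pop = rng.uniform(-5, 5, size=(30, 10))
for _ in range(200):
    cand = bbtlbo_teacher_phase(pop, sphere(pop), rng)
    improved = sphere(cand) < sphere(pop)             # greedy selection, as in TLBO
    pop[improved] = cand[improved]
print(sphere(pop).min())
```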

  4. Bare-bones teaching-learning-based optimization.

    PubMed

    Zou, Feng; Wang, Lei; Hei, Xinhong; Chen, Debao; Jiang, Qiaoyong; Li, Hongye

    2014-01-01

    The teaching-learning-based optimization (TLBO) algorithm, which simulates the teaching-learning process of a classroom, is one of the recently proposed swarm intelligence (SI) algorithms. In this paper, a new TLBO variant called bare-bones teaching-learning-based optimization (BBTLBO) is presented to solve global optimization problems. In this method, each learner in the teacher phase employs an interactive learning strategy, which is a hybridization of the learning strategy of the teacher phase in the standard TLBO and Gaussian sampling learning based on neighborhood search, and each learner in the learner phase employs either the learning strategy of the learner phase in the standard TLBO or the new neighborhood search strategy. To verify the performance of our approach, 20 benchmark functions and two real-world problems are utilized. The conducted experiments show that BBTLBO performs significantly better than, or at least comparably to, TLBO and some existing bare-bones algorithms. The results indicate that the proposed algorithm is competitive with some other optimization algorithms.

  5. Weight-Based Victimization among Adolescents in the School Setting: Emotional Reactions and Coping Behaviors

    ERIC Educational Resources Information Center

    Puhl, Rebecca M.; Luedicke, Joerg

    2012-01-01

    Weight-based victimization is a frequent experience for adolescents, but little is known about their emotional reactions and coping strategies in response to weight-based teasing and bullying. The present study examined the ways that adolescents cope with experiences of weight-based victimization at school. An initial sample of 1,555 students from…

  6. Enhancing adaptive sparse grid approximations and improving refinement strategies using adjoint-based a posteriori error estimates

    DOE PAGES

    Jakeman, J. D.; Wildey, T.

    2015-01-01

    In this paper we present an algorithm for adaptive sparse grid approximations of quantities of interest computed from discretized partial differential equations. We use adjoint-based a posteriori error estimates of the interpolation error in the sparse grid to enhance the sparse grid approximation and to drive adaptivity. We show that utilizing these error estimates provides significantly more accurate functional values for random samples of the sparse grid approximation. We also demonstrate that alternative refinement strategies based upon a posteriori error estimates can lead to further increases in accuracy in the approximation over traditional hierarchical-surplus-based strategies. Throughout this paper we also provide and test a framework for balancing the physical discretization error with the stochastic interpolation error of the enhanced sparse grid approximation.

  7. Community-Based Adaptation To A Changing Climate

    EPA Pesticide Factsheets

    This resource discusses how climate change is affecting community services, presents sample adaptation strategies, gives examples of successful community adaptation actions, and provides links to other key federal resources.

  8. Teachers' Improvisation of Instructional Materials for Nigerian Home Economics Curriculum Delivery: Challenges and Strategies

    ERIC Educational Resources Information Center

    Olibie, Eyiuche Ifeoma; Nwabunwanne, Chinyere; Ezenwanne, Dorothy Nkem

    2013-01-01

    This study was designed to ascertain the challenges of improvising instructional materials by Home Economics teachers at the Upper Basic education level in Nigeria, and as a result identify strategies for enhancing improvisation. The study used survey research design based on two research questions. The sample was four hundred and thirty-one Home…

  9. Canadian Families' Strategies for Employment and Care for Preschool Children

    ERIC Educational Resources Information Center

    Ornstein, Michael; Stalker, Glenn J.

    2013-01-01

    Based on the 2006 Canadian Census "long form" sample of one in every five households, the authors develop a detailed typology of family strategies for employment and the care of preschool children. The analysis is restricted to opposite-sex couples with at least one child under age 6 and no older child or other adult in the household.…

  10. Soil sampling strategies: evaluation of different approaches.

    PubMed

    de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Mufato, Renzo; Sartori, Giuseppe; Stocchero, Giulia

    2008-11-01

    The National Environmental Protection Agency of Italy (APAT) performed a soil sampling intercomparison, inviting 14 regional agencies to test their own soil sampling strategies. The intercomparison was carried out at a reference site previously characterised for its metal mass fraction distribution. A wide range of sampling strategies, in terms of sampling patterns and the type and number of samples collected, was used to assess the mean mass fraction values of selected elements. The different strategies generally led to acceptable bias values (D) of less than 2σ, calculated according to ISO 13258. Sampling on arable land was relatively easy, with comparable results between the different sampling strategies.
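
    Read as a worked restatement of the acceptance criterion above (our paraphrase, not the wording of the cited standard), the bias of each strategy is

```latex
D = \left| \bar{x}_{\text{strategy}} - \bar{x}_{\text{reference}} \right| ,
\qquad \text{accepted when } D < 2\sigma ,
```

    where the strategy mean is the mean mass fraction obtained with a given sampling strategy, the reference mean is the previously characterised value for the site, and σ is the standard deviation used in the evaluation.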

  11. Autism treatment survey: services received by children with autism spectrum disorders in public school classrooms.

    PubMed

    Hess, Kristen L; Morrier, Michael J; Heflin, L Juane; Ivey, Michelle L

    2008-05-01

    The Autism Treatment Survey was developed to identify strategies used in education of children with autism spectrum disorders (ASD) in Georgia. Respondents of the web-based survey included a representative sample of 185 teachers across the state, reporting on 226 children with ASD in grades preschool-12th. The top five strategies being used in Georgia (Gentle Teaching, sensory integration, cognitive behavioral modification, assistive technology, and Social Stories) are recognized as lacking a scientific basis for implementation. Analysis revealed the choice of strategies varied by grade level and classroom type (e.g., general education, special education). Results highlight clear implications for preservice and inservice educator training, and the need for continued research to document evidence-based strategy use in public schools for students with ASD.

  12. Cost-effectiveness of diagnostic strategies using quantitative real-time PCR and bacterial culture to identify contagious mastitis cases in large dairy herds.

    PubMed

    Murai, Kiyokazu; Lehenbauer, Terry W; Champagne, John D; Glenn, Kathy; Aly, Sharif S

    2014-03-01

    Diagnostic strategies to detect contagious mastitis caused by Mycoplasma bovis, Staphylococcus aureus, and Streptococcus agalactiae in dairy herds during an outbreak have been minimally studied with regard to cost and diagnostic sensitivity. The objective of this cross-sectional study was to compare the cost-effectiveness of diagnostic strategies for identification of infected cows in two California dairy herds during contagious mastitis outbreaks. M. bovis was investigated in a subset of a herd (n=1210 cows) with an estimated prevalence of 2.8% (95% CI=1.9, 3.7), whereas Staph. aureus and Strep. agalactiae were studied in a second herd (n=351 cows) with estimated prevalences of 3.4% (95% CI=1.5, 5.3) and 16.8% (95% CI=12.9, 20.7), respectively. Diagnostic strategies involved a combination of testing stages that utilized bacterial culture, quantitative real-time PCR (qPCR), or both. Strategies were applied to individual samples or to pools of 5, 10, 50 or 100 samples. Culture was considered the gold standard for sensitivity estimation of each strategy. The reference strategy was the strategy with the lowest cost per culture-positive cow, which for both M. bovis and Strep. agalactiae consisted of two stages: culture of samples in pools of 5 followed by culture of individual samples in positive pools, with sensitivities of 73.5% (95% CI: 55.6, 87.1) and 96.6% (95% CI: 27.7, 84.8), respectively. The reference strategy for Staph. aureus consisted of three stages: culture of samples in pools of 100 (stage 1), culture of the constituents of positive pools from stage 1 in pools of 5 (stage 2), and individual culture of the constituents of positive pools from stage 2 (stage 3), which resulted in a sensitivity of 58.3% (95% CI: 88.3, 99.6). The most cost-effective alternative to the reference strategy was whole-herd milk culture for all three pathogens. qPCR testing was a component of the second most cost-effective alternative for M. bovis and of the third most cost-effective alternatives for the three pathogens. A stochastic model was used to assess the effect of prevalence and herd size on the cost-effectiveness of the diagnostic strategies. In the current study, increasing the prevalence of mastitis did not alter the ranking of strategies by cost-effectiveness. However, larger herds could benefit from testing larger pools, such as 50 or 100 samples, to improve cost-effectiveness. Several diagnostic strategy options exist to identify contagious mastitis in herds; decisions should be based on the cost and sensitivity of the available strategies. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
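
    The cost comparison that drives such rankings can be sketched as follows (hypothetical per-culture cost and perfect pool sensitivity assumed; the study's actual cost model and sensitivity estimates are not reproduced):

```python
# Sketch: expected cost per detected positive for a two-stage pooled-culture
# strategy (pools of 5, then individual culture of positive pools) versus
# whole-herd individual culture.
import numpy as np

def two_stage_pool_cost(n_cows, prevalence, pool_size, cost_per_culture):
    n_pools = np.ceil(n_cows / pool_size)
    p_pool_pos = 1 - (1 - prevalence) ** pool_size     # chance a pool holds >=1 positive
    expected_cultures = n_pools + n_pools * p_pool_pos * pool_size
    return expected_cultures * cost_per_culture

n_cows, prevalence, cost = 1210, 0.028, 15.0           # hypothetical US$15 per culture
pooled = two_stage_pool_cost(n_cows, prevalence, 5, cost)
individual = n_cows * cost
expected_positives = n_cows * prevalence
print(round(pooled / expected_positives, 1), round(individual / expected_positives, 1))
```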

  13. Coping with matrix effects in headspace solid phase microextraction gas chromatography using multivariate calibration strategies.

    PubMed

    Ferreira, Vicente; Herrero, Paula; Zapata, Julián; Escudero, Ana

    2015-08-14

    SPME is extremely sensitive to experimental parameters affecting liquid-gas and gas-solid distribution coefficients. Our aims were to measure the weights of these factors and to design a multivariate strategy, based on the addition of a pool of internal standards, to minimize matrix effects. Synthetic but real-like wines containing selected analytes and variable amounts of ethanol, non-volatile constituents and major volatile compounds were prepared following a factorial design. The ANOVA study revealed that, even with a strong matrix dilution, matrix effects are important and additive, with non-significant interaction effects, and that the presence of major volatile constituents is the dominant factor. A single internal standard provided a robust calibration for 15 out of 47 analytes. Then, two different multivariate calibration strategies based on Partial Least Squares Regression were run in order to build calibration functions, based on 13 different internal standards, able to cope with matrix effects. The first is based on the calculation of Multivariate Internal Standards (MIS), linear combinations of the normalized signals of the 13 internal standards, which provide the expected area of a given unit of analyte present in each sample. The second strategy is a direct calibration relating concentration to the 13 relative areas measured in each sample for each analyte. Overall, 47 different compounds can be reliably quantified in a single, fully automated method with overall uncertainties better than 15%. Copyright © 2015 Elsevier B.V. All rights reserved.
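
    A sketch of the second, direct multivariate calibration idea (simulated responses with a sample-specific matrix factor affecting the analyte and the internal standards alike; not the authors' data or exact model):

```python
# Sketch: a PLS model that predicts analyte concentration from the analyte area
# relative to a panel of 13 internal standards, so that common matrix effects
# largely cancel out.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n_samples, n_is = 40, 13
conc = rng.uniform(10, 500, n_samples)                      # hypothetical concentrations (ug/L)
matrix_effect = rng.normal(1.0, 0.15, n_samples)            # sample-specific response factor
is_areas = rng.normal(1.0, 0.05, (n_samples, n_is)) * matrix_effect[:, None]
analyte_area = conc * matrix_effect * rng.normal(1.0, 0.03, n_samples)

X = analyte_area[:, None] / is_areas                        # relative areas vs 13 standards
model = PLSRegression(n_components=3).fit(X, conc)
pred = model.predict(X).ravel()
print(f"mean relative error: {np.mean(np.abs(pred - conc) / conc):.1%}")
```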

  14. Innovative Recruitment Using Online Networks: Lessons Learned From an Online Study of Alcohol and Other Drug Use Utilizing a Web-Based, Respondent-Driven Sampling (webRDS) Strategy

    PubMed Central

    Bauermeister, José A.; Zimmerman, Marc A.; Johns, Michelle M.; Glowacki, Pietreck; Stoddard, Sarah; Volz, Erik

    2012-01-01

    Objective: We used a web version of Respondent-Driven Sampling (webRDS) to recruit a sample of young adults (ages 18–24) and examined whether this strategy would result in alcohol and other drug (AOD) prevalence estimates comparable to national estimates (National Survey on Drug Use and Health [NSDUH]). Method: We recruited 22 initial participants (seeds) via Facebook to complete a web survey examining AOD risk correlates. Sequential, incentivized recruitment continued until our desired sample size was achieved. After correcting for webRDS clustering effects, we contrasted our AOD prevalence estimates (past 30 days) to NSDUH estimates by comparing the 95% confidence intervals of prevalence estimates. Results: We found comparable AOD prevalence estimates between our sample and NSDUH for the past 30 days for alcohol, marijuana, cocaine, Ecstasy (3,4-methylenedioxymethamphetamine, or MDMA), and hallucinogens. Cigarette use was lower than NSDUH estimates. Conclusions: WebRDS may be a suitable strategy to recruit young adults online. We discuss the unique strengths and challenges that may be encountered by public health researchers using webRDS methods. PMID:22846248

  15. Recruiting Young Adult Cancer Survivors for Behavioral Research

    PubMed Central

    Horowitz, Santina; Marcus, Bess

    2012-01-01

    Young adults have been dramatically underrepresented in cancer survivorship research. One contributing factor is the difficulty recruiting this population. To identify effective recruitment strategies, the current study assessed the yield of strategies used to recruit young survivors for an exercise intervention including: clinic-based recruitment, recruitment at cancer-related events, mailings, telephone-based recruitment, advertising on the internet, radio, television and social networking media, distributing brochures and word-of-mouth referrals. When taking into account the strategies for which we could track the number of survivors approached, recruitment at an oncology clinic was the most productive: 38 % of those approached were screened and 8 % enrolled. When evaluating which strategy yielded the greatest percentage of the sample, however, mailings were the most productive. Given widespread use of the internet and social networking by young adults, investigators should also consider these low-cost recruitment strategies. PMID:22810954

  16. Recruiting young adult cancer survivors for behavioral research.

    PubMed

    Rabin, Carolyn; Horowitz, Santina; Marcus, Bess

    2013-03-01

    Young adults have been dramatically underrepresented in cancer survivorship research. One contributing factor is the difficulty recruiting this population. To identify effective recruitment strategies, the current study assessed the yield of strategies used to recruit young survivors for an exercise intervention including: clinic-based recruitment, recruitment at cancer-related events, mailings, telephone-based recruitment, advertising on the internet, radio, television and social networking media, distributing brochures and word-of-mouth referrals. When taking into account the strategies for which we could track the number of survivors approached, recruitment at an oncology clinic was the most productive: 38 % of those approached were screened and 8 % enrolled. When evaluating which strategy yielded the greatest percentage of the sample, however, mailings were the most productive. Given widespread use of the internet and social networking by young adults, investigators should also consider these low-cost recruitment strategies.

  17. Detection of Toxoplasma gondii oocysts in water: proposition of a strategy and evaluation in Champagne-Ardenne Region, France.

    PubMed

    Aubert, D; Villena, I

    2009-03-01

    Water is a vehicle for disseminating human and veterinary toxoplasmosis due to oocyst contamination. Several outbreaks of toxoplasmosis throughout the world have been related to contaminated drinking water. We have developed a method for the detection of Toxoplasma gondii oocysts in water and we propose a strategy for the detection of multiple waterborne parasites, including Cryptosporidium spp. and Giardia. Water samples were filtered to recover Toxoplasma oocysts and, after the detection of Cryptosporidium oocysts and Giardia cysts by immunofluorescence, as recommended by French norm procedure NF T 90-455, the samples were purified on a sucrose density gradient. Detection of Toxoplasma was based on PCR amplification and mouse inoculation to determine the presence and infectivity of recovered oocysts. After experimental seeding assays, we determined that the PCR assay was more sensitive than the bioassay. This strategy was then applied to 482 environmental water samples collected since 2001. We detected Toxoplasma DNA in 37 environmental samples (7.7%), including public drinking water; however, none of them were positive by bioassay. This strategy efficiently detects Toxoplasma oocysts in water and may be suitable as a public health sentinel method. Alternative methods can be used in conjunction with this one to determine the infectivity of parasites that were detected by molecular methods.

  18. Sampling Of SAR Imagery For Wind Resource Assessment

    NASA Astrophysics Data System (ADS)

    Badger, Merete; Badger, Jake; Hasager, Charlotte; Nielsen, Morten

    2010-04-01

    Wind resources over the sea can be assessed from a series of wind fields retrieved from Envisat ASAR imagery or other SAR data. Previous wind resource maps have been produced through random sampling of 70 or more satellite scenes over a given area of interest, followed by fitting of a Weibull function to the data. Here we introduce a more advanced sampling strategy based on the wind class methodology that is normally applied in Risø DTU’s numerical modeling of wind resources. The aim is to obtain a more representative data set using fewer satellite SAR scenes. The new sampling strategy has been applied within a wind and solar resource assessment study for the United Arab Emirates (UAE) and also for wind resource mapping over a domain in the North Sea, as part of the EU NORSEWInD project (2008-2012).
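
    The Weibull-fitting step that turns a set of scene-wise wind speeds into a resource estimate can be sketched as follows (simulated wind speeds, not Envisat ASAR retrievals):

```python
# Sketch: fit a Weibull distribution to a set of wind speeds (one value per SAR
# "scene") and derive the mean wind speed from the fitted parameters.
import numpy as np
from scipy.stats import weibull_min

wind_speeds = weibull_min.rvs(c=2.1, scale=8.5, size=70, random_state=6)  # 70 simulated scenes

shape, loc, scale = weibull_min.fit(wind_speeds, floc=0)    # fix the location at zero
mean_speed = weibull_min.mean(shape, loc=loc, scale=scale)
print(f"Weibull k={shape:.2f}, A={scale:.2f} m/s, mean={mean_speed:.2f} m/s")
```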

  19. Portable, universal, and visual ion sensing platform based on the light emitting diode-based self-referencing-ion selective field-effect transistor.

    PubMed

    Zhang, Xiaowei; Han, Yanchao; Li, Jing; Zhang, Libing; Jia, Xiaofang; Wang, Erkang

    2014-02-04

    In this work, a novel and universal ion sensing platform is presented that enables the visual detection of various ions with high sensitivity and selectivity. Coaxial potential signals (millivolt-scale) of the sample from the self-referencing (SR) ion-selective chip can be transferred into the AD620-based amplifier, which outputs volt-scale potentials. The amplified voltage is high enough to drive a light-emitting diode (LED), which can be used as an amplifier and indicator to report the sample information. With this double amplification device (light-emitting diode-based self-referencing ion-selective field-effect transistor, LED-SR-ISFET), a tiny change in the sample concentration can be observed as a distinguishable variation in LED brightness by visual inspection. This LED-based luminescent platform provides a facile, low-cost, and rapid sensing strategy without the need for additional expensive chemiluminescence reagents and instruments. Moreover, the SR mode also endows this device with excellent stability and reliability. With this innovative design, sensitive determination of K(+), H(+), and Cl(-) by the naked eye was achieved. It should also be noted that this sensing strategy can easily be extended to other ions (or molecules) by simply integrating the corresponding ion (or molecule) selective electrode.

  20. Comparative analysis of vaginal microbiota sampling using 16S rRNA gene analysis.

    PubMed

    Virtanen, Seppo; Kalliala, Ilkka; Nieminen, Pekka; Salonen, Anne

    2017-01-01

    Molecular methods such as next-generation sequencing are actively being employed to characterize the vaginal microbiota in health and disease. Previous studies have focused on characterizing the biological variation in the microbiota, and less is known about how factors related to sampling contribute to the results. Our aim was to investigate the impact of a sampling device and anatomical sampling site on the quantitative and qualitative outcomes relevant for vaginal microbiota research. We sampled 10 Finnish women representing diverse clinical characteristics with flocked swabs, the Evalyn® self-sampling device, sterile plastic spatulas and a cervical brush that were used to collect samples from fornix, vaginal wall and cervix. Samples were compared on DNA and protein yield, bacterial load, and microbiota diversity and species composition based on Illumina MiSeq sequencing of the 16S rRNA gene. We quantified the relative contributions of sampling variables versus intrinsic variables in the overall microbiota variation, and evaluated the microbiota profiles using several commonly employed metrics such as alpha and beta diversity as well as abundance of major bacterial genera and species. The total DNA yield was strongly dependent on the sampling device and to a lesser extent on the anatomical site of sampling. The sampling strategy did not affect the protein yield or the bacterial load. All tested sampling methods produced highly comparable microbiota profiles based on MiSeq sequencing. The sampling method explained only 2% (p-value = 0.89) of the overall microbiota variation, markedly surpassed by intrinsic factors such as clinical status (microscopy for bacterial vaginosis 53%, p = 0.0001), bleeding (19%, p = 0.0001), and the variation between subjects (11%, p-value 0.0001). The results indicate that different sampling strategies yield comparable vaginal microbiota composition and diversity. Hence, past and future vaginal microbiota studies employing different sampling strategies should be comparable in the absence of other technical confounders. The Evalyn® self-sampling device performed equally well compared to samples taken by a clinician, and hence offers a good-quality microbiota sample without the need for a gynecological examination. The amount of collected sample as well as the DNA and protein yield varied across the sampling techniques, which may have practical implications for study design.

  1. Comparative analysis of vaginal microbiota sampling using 16S rRNA gene analysis

    PubMed Central

    Kalliala, Ilkka; Nieminen, Pekka; Salonen, Anne

    2017-01-01

    Background Molecular methods such as next-generation sequencing are actively being employed to characterize the vaginal microbiota in health and disease. Previous studies have focused on characterizing the biological variation in the microbiota, and less is known about how factors related to sampling contribute to the results. Our aim was to investigate the impact of a sampling device and anatomical sampling site on the quantitative and qualitative outcomes relevant for vaginal microbiota research. We sampled 10 Finnish women representing diverse clinical characteristics with flocked swabs, the Evalyn® self-sampling device, sterile plastic spatulas and a cervical brush that were used to collect samples from fornix, vaginal wall and cervix. Samples were compared on DNA and protein yield, bacterial load, and microbiota diversity and species composition based on Illumina MiSeq sequencing of the 16S rRNA gene. We quantified the relative contributions of sampling variables versus intrinsic variables in the overall microbiota variation, and evaluated the microbiota profiles using several commonly employed metrics such as alpha and beta diversity as well as abundance of major bacterial genera and species. Results The total DNA yield was strongly dependent on the sampling device and to a lesser extent on the anatomical site of sampling. The sampling strategy did not affect the protein yield or the bacterial load. All tested sampling methods produced highly comparable microbiota profiles based on MiSeq sequencing. The sampling method explained only 2% (p-value = 0.89) of the overall microbiota variation, markedly surpassed by intrinsic factors such as clinical status (microscopy for bacterial vaginosis 53%, p = 0.0001), bleeding (19%, p = 0.0001), and the variation between subjects (11%, p-value 0.0001). Conclusions The results indicate that different sampling strategies yield comparable vaginal microbiota composition and diversity. Hence, past and future vaginal microbiota studies employing different sampling strategies should be comparable in the absence of other technical confounders. The Evalyn® self-sampling device performed equally well compared to samples taken by a clinician, and hence offers a good-quality microbiota sample without the need for a gynecological examination. The amount of collected sample as well as the DNA and protein yield varied across the sampling techniques, which may have practical implications for study design. PMID:28723942

  2. Flow Cytometry: Impact on Early Drug Discovery.

    PubMed

    Edwards, Bruce S; Sklar, Larry A

    2015-07-01

    Modern flow cytometers can make optical measurements of 10 or more parameters per cell at tens of thousands of cells per second and more than five orders of magnitude dynamic range. Although flow cytometry is used in most drug discovery stages, "sip-and-spit" sampling technology has restricted it to low-sample-throughput applications. The advent of HyperCyt sampling technology has recently made possible primary screening applications in which tens of thousands of compounds are analyzed per day. Target-multiplexing methodologies in combination with extended multiparameter analyses enable profiling of lead candidates early in the discovery process, when the greatest numbers of candidates are available for evaluation. The ability to sample small volumes with negligible waste reduces reagent costs, compound usage, and consumption of cells. Improved compound library formatting strategies can further extend primary screening opportunities when samples are scarce. Dozens of targets have been screened in 384- and 1536-well assay formats, predominantly in academic screening lab settings. In concert with commercial platform evolution and trending drug discovery strategies, HyperCyt-based systems are now finding their way into mainstream screening labs. Recent advances in flow-based imaging, mass spectrometry, and parallel sample processing promise dramatically expanded single-cell profiling capabilities to bolster systems-level approaches to drug discovery. © 2015 Society for Laboratory Automation and Screening.

  3. Flow Cytometry: Impact On Early Drug Discovery

    PubMed Central

    Edwards, Bruce S.; Sklar, Larry A.

    2015-01-01

    Summary Modern flow cytometers can make optical measurements of 10 or more parameters per cell at tens-of-thousands of cells per second and over five orders of magnitude dynamic range. Although flow cytometry is used in most drug discovery stages, “sip-and-spit” sampling technology has restricted it to low sample throughput applications. The advent of HyperCyt sampling technology has recently made possible primary screening applications in which tens-of-thousands of compounds are analyzed per day. Target-multiplexing methodologies in combination with extended multi-parameter analyses enable profiling of lead candidates early in the discovery process, when the greatest numbers of candidates are available for evaluation. The ability to sample small volumes with negligible waste reduces reagent costs, compound usage and consumption of cells. Improved compound library formatting strategies can further extend primary screening opportunities when samples are scarce. Dozens of targets have been screened in 384- and 1536-well assay formats, predominantly in academic screening lab settings. In concert with commercial platform evolution and trending drug discovery strategies, HyperCyt-based systems are now finding their way into mainstream screening labs. Recent advances in flow-based imaging, mass spectrometry and parallel sample processing promise dramatically expanded single cell profiling capabilities to bolster systems level approaches to drug discovery. PMID:25805180

  4. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE PAGES

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; ...

    2017-12-27

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency of the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search for informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions of different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method to two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.
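
    A loose sketch of the adaptive enrichment loop (our reading of the idea: score candidates by the deviation of the surrogate from a local first-order Taylor expansion, combined with the distance to existing design points; this is not the TEAD code, and the actual score function and stopping criterion differ):

```python
# Sketch: adaptively add training points where the surrogate looks most
# nonlinear (Taylor-expansion mismatch) and far from existing samples.
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_model(x):                          # stand-in for an expensive simulator
    return np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1])

rng = np.random.default_rng(7)
X = rng.uniform(-1, 1, (10, 2))                  # initial design
y = expensive_model(X)

for _ in range(20):                              # adaptive enrichment loop
    surrogate = RBFInterpolator(X, y)
    cand = rng.uniform(-1, 1, (200, 2))          # candidate pool
    idx = np.argmin(np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2), axis=1)
    nearest_x, nearest_y = X[idx], y[idx]        # nearest existing design point
    h = 1e-3
    grad = np.stack([(surrogate(nearest_x + h * e) - surrogate(nearest_x - h * e)) / (2 * h)
                     for e in np.eye(2)], axis=1)
    taylor = nearest_y + np.einsum("ij,ij->i", grad, cand - nearest_x)
    dist = np.linalg.norm(cand - nearest_x, axis=1)
    score = np.abs(surrogate(cand) - taylor) + 0.5 * dist   # hybrid score (assumed weights)
    new_x = cand[np.argmax(score)]
    X = np.vstack([X, new_x])
    y = np.append(y, expensive_model(new_x[None, :]))

print(len(X), "training points after adaptive refinement")
```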

  5. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency of the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search for informative samples, and a robust stopping criterion to terminate the sample search that guarantees the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions of different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method to two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  6. Program Evaluation of Outcomes Based Orthotic and Prosthetic Education

    DTIC Science & Technology

    2007-12-01

    Only fragments of this report's abstract are recoverable: they describe the writing of new Biomechanics and Gait curricula and syllabi and the implementation of online learning strategies through the Blackboard online learning platform. Report subject terms: Prosthetics, Orthotics, Prosthesis, Orthosis, Amputation, Outcomes Based Evaluation. The appendices include a sample Biomechanics syllabus and a sample final examination.

  7. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software

    PubMed Central

    Dols, W. Stuart; Persily, Andrew K.; Morrow, Jayne B.; Matzke, Brett D.; Sego, Landon H.; Nuffer, Lisa L.; Pulsipher, Brent A.

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority area zones within the building, and sampling designs and strategies could then be developed based on those zones. PMID:27134782

  8. Development and Demonstration of a Method to Evaluate Bio-Sampling Strategies Using Building Simulation and Sample Planning Software.

    PubMed

    Dols, W Stuart; Persily, Andrew K; Morrow, Jayne B; Matzke, Brett D; Sego, Landon H; Nuffer, Lisa L; Pulsipher, Brent A

    2010-01-01

    In an effort to validate and demonstrate response and recovery sampling approaches and technologies, the U.S. Department of Homeland Security (DHS), along with several other agencies, has simulated a biothreat agent release within a facility at Idaho National Laboratory (INL) on two separate occasions in the fall of 2007 and the fall of 2008. Because these events constitute only two realizations of many possible scenarios, increased understanding of sampling strategies can be obtained by virtually examining a wide variety of release and dispersion scenarios using computer simulations. This research effort demonstrates the use of two software tools, CONTAM, developed by the National Institute of Standards and Technology (NIST), and Visual Sample Plan (VSP), developed by Pacific Northwest National Laboratory (PNNL). The CONTAM modeling software was used to virtually contaminate a model of the INL test building under various release and dissemination scenarios as well as a range of building design and operation parameters. The results of these CONTAM simulations were then used to investigate the relevance and performance of various sampling strategies using VSP. One of the fundamental outcomes of this project was the demonstration of how CONTAM and VSP can be used together to effectively develop sampling plans to support the various stages of response to an airborne chemical, biological, radiological, or nuclear event. Following such an event (or prior to an event), incident details and the conceptual site model could be used to create an ensemble of CONTAM simulations which model contaminant dispersion within a building. These predictions could then be used to identify priority area zones within the building, and sampling designs and strategies could then be developed based on those zones.

  9. A community stakeholder analysis of drug resistance strategies of rural native Hawaiian youth.

    PubMed

    Okamoto, Scott K; Helm, Susana; Delp, Justin A; Stone, Kristina; Dinson, Ay-Laina; Stetkiewicz, Jennifer

    2011-08-01

    This study examines and validates the drug resistance strategies identified by rural Hawaiian youth from prior research with a sample of community stakeholders on the Island of Hawai'i. One hundred thirty-eight stakeholders with a vested interest in reducing youth substance use (i.e., teachers, principals, social service agency providers, and older youth) completed a web-based survey comprised of 15 drug-related problem situations and 413 responses developed by Hawaiian youth. The findings corroborated the youth-focused findings from prior research. Differences in the endorsement of different strategies were examined based on gender, ethnicity, and age of the stakeholders. Implications for culturally grounded drug prevention in rural Hawaiian communities are discussed.

  10. Adaptive Sampling for Urban Air Quality through Participatory Sensing

    PubMed Central

    Zeng, Yuanyuan; Xiang, Kai

    2017-01-01

    Air pollution is one of the major problems of the modern world. The popularity and powerful functions of smartphone applications enable people to participate in urban sensing and learn more about the air quality problems around them. Data sampling is one of the most important factors affecting sensing performance. In this paper, we propose an Adaptive Sampling Scheme for Urban Air Quality (AS-air) through participatory sensing. First, we mine pattern rules of air quality from the historical data contributed by participants using the Apriori algorithm. Based on these rules, we predict the online air quality and use the prediction to accelerate a Q-learning process that chooses and adapts the sampling parameter. The evaluation results show that AS-air provides an energy-efficient sampling strategy that adapts to the varying outdoor air environment with good sampling efficiency. PMID:29099766
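
    A toy sketch of the Q-learning part (our simplification, with invented states, intervals and reward; the paper's state definition, Apriori-based prediction and reward shaping are not reproduced):

```python
# Sketch: a Q-learning agent picks the next sampling interval, trading off
# energy cost against the error introduced when air quality changes quickly.
import numpy as np

rng = np.random.default_rng(8)
intervals = [5, 15, 30, 60]                  # candidate sampling intervals (minutes)
n_states = 3                                 # AQI trend: 0 stable, 1 drifting, 2 volatile
Q = np.zeros((n_states, len(intervals)))
alpha, gamma, eps = 0.1, 0.9, 0.1

state = 0
for step in range(5_000):
    a = rng.integers(len(intervals)) if rng.random() < eps else int(np.argmax(Q[state]))
    volatility = state / (n_states - 1)      # hypothetical link between trend and error
    sampling_error = volatility * intervals[a] / 60
    energy_cost = 1.0 / intervals[a]
    reward = -(sampling_error + energy_cost)
    next_state = rng.integers(n_states)      # toy environment: the trend changes randomly
    Q[state, a] += alpha * (reward + gamma * Q[next_state].max() - Q[state, a])
    state = next_state

print("preferred interval per trend state:", [intervals[int(i)] for i in Q.argmax(axis=1)])
```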

  11. Replica Exchange Improves Sampling in Low-Resolution Docking Stage of RosettaDock

    PubMed Central

    Zhang, Zhe; Lange, Oliver F.

    2013-01-01

    Many protein-protein docking protocols are based on a shotgun approach, in which thousands of independent random-start trajectories minimize the rigid-body degrees of freedom. Another strategy is enumerative sampling, as used in ZDOCK. Here, we introduce an alternative strategy, ReplicaDock, using a small number of long trajectories of temperature replica exchange. We compare replica exchange sampling as the low-resolution stage of RosettaDock with RosettaDock's original shotgun sampling as well as with ZDOCK. A benchmark of 30 complexes starting from structures of the unbound binding partners shows improved performance for ReplicaDock and ZDOCK when compared to shotgun sampling at equal or lower computational expense. ReplicaDock and ZDOCK consistently reach lower energies and generate significantly more near-native conformations than shotgun sampling. Accordingly, they both improve typical metrics of prediction quality of complex structures after refinement. Additionally, the refined ReplicaDock ensembles reach significantly lower interface energies, and many previously hidden features of the docking energy landscape become visible when ReplicaDock is applied. PMID:24009670
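
    For reference, the temperature replica-exchange move underlying such a protocol can be sketched with the generic Metropolis swap criterion (an illustrative sketch, not RosettaDock source code):

```python
# Sketch: neighbouring replicas at temperatures T_i < T_j exchange configurations
# with probability min(1, exp((1/T_i - 1/T_j) * (E_i - E_j))).
import numpy as np

def attempt_swaps(energies, temperatures, rng):
    """Try to swap each neighbouring pair of replicas once; return the new order."""
    order = np.arange(len(energies))
    for i in range(len(energies) - 1):
        beta_i, beta_j = 1.0 / temperatures[i], 1.0 / temperatures[i + 1]
        delta = (beta_i - beta_j) * (energies[order[i]] - energies[order[i + 1]])
        if rng.random() < min(1.0, np.exp(delta)):
            order[i], order[i + 1] = order[i + 1], order[i]
    return order

rng = np.random.default_rng(9)
temps = np.array([1.0, 1.5, 2.2, 3.3])              # temperature ladder (arbitrary units)
energies = np.array([-120.0, -95.0, -80.0, -60.0])  # hypothetical replica energies
print(attempt_swaps(energies, temps, rng))
```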

  12. Getting men in the room: perceptions of effective strategies to initiate men's involvement in gender-based violence prevention in a global sample.

    PubMed

    Casey, Erin A; Leek, Cliff; Tolman, Richard M; Allen, Christopher T; Carlson, Juliana M

    2017-09-01

    As engaging men in gender-based violence prevention efforts becomes an increasingly institutionalised component of gender equity work globally, clarity is needed about the strategies that best initiate male-identified individuals' involvement in these efforts. The purpose of this study was to examine the perceived relevance and effectiveness of men's engagement strategies from the perspective of men around the world who have organised or attended gender-based violence prevention events. Participants responded to an online survey (available in English, French and Spanish) and rated the effectiveness of 15 discrete engagement strategies derived from earlier qualitative work. Participants also provided suggestions regarding strategies in open-ended comments. Listed strategies cut across the social ecological spectrum and represented both venues in which to reach men, and the content of violence prevention messaging. Results suggest that all strategies, on average, were perceived as effective across regions of the world, with strategies that tailor messaging to topics of particular concern to men (such as fatherhood and healthy relationships) rated most highly. Open-ended comments also surfaced tensions, particularly related to the role of a gender analysis in initial men's engagement efforts. Findings suggest the promise of cross-regional adaptation and information sharing regarding successful approaches to initiating men's anti-violence involvement.

  13. Preparation of solid-phase microextraction fibers by in-mold coating strategy for derivatization analysis of 24-epibrassinolide in pollen samples.

    PubMed

    Pan, Jialiang; Hu, Yuling; Liang, Tingan; Li, Gongke

    2012-11-02

    A novel and simple in-mold coating strategy was proposed for the preparation of uniform solid-phase microextraction (SPME) coatings. Such a strategy is based on the direct synthesis of the polymer coating on the surface of a solid fiber using a glass capillary as the mold. The capillary was removed and the polymer, with well-controlled thickness, could be coated on the silica fiber reproducibly. Following the strategy, a new poly(acrylamide-co-ethylene glycol dimethacrylate) (poly(AM-co-EGDMA)) coating was prepared for the preconcentration of 24-epibrassinolide (24-epiBL) from plant matrices. The coating provided a 32-fold enrichment factor, and the extraction efficiency per unit thickness was 5 times higher than that of the commercial polydimethylsiloxane/divinylbenzene (PDMS/DVB) coating. A novel method based on SPME coupled with derivatization and large volume injection-high performance liquid chromatography (LVI-HPLC) was developed for the analysis of 24-epiBL. The linear range was 0.500-20.0 μg/L with a detection limit of 0.13 μg/L. The amounts of endogenous 24-epiBL in wall-broken rape and sunflower pollen samples were determined with satisfactory recovery (77.8-104%) and reproducibility (3.9-7.9%). The SPME-DE/LVI-HPLC method is rapid, reliable, convenient and applicable to complicated plant samples. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Identification of Poly(ethylene glycol) and Poly(ethylene glycol)-Based Detergents Using Peptide Search Engines.

    PubMed

    Ahmadi, Shiva; Winter, Dominic

    2018-06-05

    Poly(ethylene glycol) (PEG) is one of the most common polymer contaminations in mass spectrometry (MS) samples. At present, the detection of PEG and other polymers relies largely on manual inspection of raw data, which is laborious and frequently difficult due to sample complexity and retention characteristics of polymer species in reversed-phase chromatography. We developed a new strategy for the automated identification of PEG molecules from tandem mass spectrometry (MS/MS) data using protein identification algorithms in combination with a database containing "PEG-proteins". Through definition of variable modifications, we extend the approach for the identification of commonly used PEG-based detergents. We exemplify the identification of different types of polymers by static nanoelectrospray tandem mass spectrometry (nanoESI-MS/MS) analysis of pure detergent solutions and data analysis using Mascot. Analysis of liquid chromatography-tandem mass spectrometry (LC-MS/MS) runs of a PEG-contaminated sample by Mascot identified 806 PEG spectra originating from four PEG species using a defined set of modifications covering PEG and common PEG-based detergents. Further characterization of the sample for unidentified PEG species using error-tolerant and mass-tolerant searches resulted in identification of 3409 and 3187 PEG-related MS/MS spectra, respectively. We further demonstrate the applicability of the strategy for Protein Pilot and MaxQuant.

  15. Recommended Immunological Strategies to Screen for Botulinum Neurotoxin-Containing Samples.

    PubMed

    Simon, Stéphanie; Fiebig, Uwe; Liu, Yvonne; Tierney, Rob; Dano, Julie; Worbs, Sylvia; Endermann, Tanja; Nevers, Marie-Claire; Volland, Hervé; Sesardic, Dorothea; Dorner, Martin B

    2015-11-26

    Botulinum neurotoxins (BoNTs) cause the life-threatening neurological illness botulism in humans and animals and are divided into seven serotypes (BoNT/A-G), of which serotypes A, B, E, and F cause the disease in humans. BoNTs are classified as "category A" bioterrorism threat agents and are relevant in the context of the Biological Weapons Convention. An international proficiency test (PT) was conducted to evaluate detection, quantification and discrimination capabilities of 23 expert laboratories from the health, food and security areas. Here we describe three immunological strategies that proved to be successful for the detection and quantification of BoNT/A, B, and E considering the restricted sample volume (1 mL) distributed. To analyze the samples qualitatively and quantitatively, the first strategy was based on sensitive immunoenzymatic and immunochromatographic assays for fast qualitative and quantitative analyses. In the second approach, a bead-based suspension array was used for screening followed by conventional ELISA for quantification. In the third approach, an ELISA plate format assay was used for serotype specific immunodetection of BoNT-cleaved substrates, detecting the activity of the light chain, rather than the toxin protein. The results provide guidance for further steps in quality assurance and highlight problems to address in the future.

  16. Recommended Immunological Strategies to Screen for Botulinum Neurotoxin-Containing Samples

    PubMed Central

    Simon, Stéphanie; Fiebig, Uwe; Liu, Yvonne; Tierney, Rob; Dano, Julie; Worbs, Sylvia; Endermann, Tanja; Nevers, Marie-Claire; Volland, Hervé; Sesardic, Dorothea; Dorner, Martin B.

    2015-01-01

    Botulinum neurotoxins (BoNTs) cause the life-threatening neurological illness botulism in humans and animals and are divided into seven serotypes (BoNT/A–G), of which serotypes A, B, E, and F cause the disease in humans. BoNTs are classified as “category A” bioterrorism threat agents and are relevant in the context of the Biological Weapons Convention. An international proficiency test (PT) was conducted to evaluate detection, quantification and discrimination capabilities of 23 expert laboratories from the health, food and security areas. Here we describe three immunological strategies that proved to be successful for the detection and quantification of BoNT/A, B, and E considering the restricted sample volume (1 mL) distributed. To analyze the samples qualitatively and quantitatively, the first strategy was based on sensitive immunoenzymatic and immunochromatographic assays for fast qualitative and quantitative analyses. In the second approach, a bead-based suspension array was used for screening followed by conventional ELISA for quantification. In the third approach, an ELISA plate format assay was used for serotype specific immunodetection of BoNT-cleaved substrates, detecting the activity of the light chain, rather than the toxin protein. The results provide guidance for further steps in quality assurance and highlight problems to address in the future. PMID:26703727

  17. Sciences Teacher Education Curriculum Re-Alignment: "Science Education Lecturers' Perspectives of Knowledge Integration at South African Universities"

    ERIC Educational Resources Information Center

    Booi, Kwanele; Khuzwayo, Mamsie Ethel

    2018-01-01

    A qualitative case study was conducted at six purposively sampled universities, out of a population of approximately 23 universities. This sampling strategy was based on selecting some universities that became Universities of Technology during the process of merging Higher Education Institutions (HEIs) while other universities kept their identity;…

  18. The Effectiveness of a Programme-Based Vocabulary Learning Strategies for Developing English Vocabulary for EFL Female Students at Taif University

    ERIC Educational Resources Information Center

    Ismaiel, Nasrah Mahmoud; Al Asmari, AbdulRahman Awadh

    2017-01-01

    Vocabulary is a chief challenge that second-language students encounter when learning another language, especially non-English majors. This study aims at assessing the influence of a suggested program for enhancing EFL students' vocabulary and their use of vocabulary learning strategies. The sample of this…

  19. Victim Resistance in Child Sexual Abuse: A Look into the Efficacy of Self-Protection Strategies Based on the Offender's Experience

    ERIC Educational Resources Information Center

    Leclerc, Benoit; Wortley, Richard; Smallbone, Stephen

    2011-01-01

    This study examines the efficacy, as experienced by offenders with their victim, of self-protection strategies used in child sexual abuse cases. It also investigates whether the efficacy of self-protection varies according to victim characteristics. The sample consists of 94 adult offenders who sexually abused a single child and who agreed to…

  20. Organizational Conflict among Teachers and the Principal's Strategies of Dealing with It from the Teachers' Perspective in Schools of Jordan

    ERIC Educational Resources Information Center

    Jubran, Ali Mohammad

    2017-01-01

    This research aimed to evaluate the degree of organizational conflict among teachers in Jordanian schools, along with the principals' strategies for dealing with it effectively. The research sample consisted of 123 teachers from Irbid, Jordan, for the academic year 2013/2014. A descriptive research approach was adopted with the help of…

  1. Enhancing adaptive sparse grid approximations and improving refinement strategies using adjoint-based a posteriori error estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakeman, J.D., E-mail: jdjakem@sandia.gov; Wildey, T.

    2015-01-01

    In this paper we present an algorithm for adaptive sparse grid approximations of quantities of interest computed from discretized partial differential equations. We use adjoint-based a posteriori error estimates of the physical discretization error and the interpolation error in the sparse grid to enhance the sparse grid approximation and to drive adaptivity of the sparse grid. Utilizing these error estimates provides significantly more accurate functional values for random samples of the sparse grid approximation. We also demonstrate that alternative refinement strategies based upon a posteriori error estimates can lead to further increases in accuracy in the approximation over traditional hierarchical surplus based strategies. Throughout this paper we also provide and test a framework for balancing the physical discretization error with the stochastic interpolation error of the enhanced sparse grid approximation.

  2. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform.

    PubMed

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-12-14

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  3. Clinical trial designs for testing biomarker-based personalized therapies

    PubMed Central

    Lai, Tze Leung; Lavori, Philip W; Shih, Mei-Chiung I; Sikic, Branimir I

    2014-01-01

    Background Advances in molecular therapeutics in the past decade have opened up new possibilities for treating cancer patients with personalized therapies, using biomarkers to determine which treatments are most likely to benefit them, but there are difficulties and unresolved issues in the development and validation of biomarker-based personalized therapies. We develop a new clinical trial design to address some of these issues. The goal is to capture the strengths of the frequentist and Bayesian approaches to address this problem in the recent literature and to circumvent their limitations. Methods We use generalized likelihood ratio tests of the intersection null and enriched strategy null hypotheses to derive a novel clinical trial design for the problem of advancing promising biomarker-guided strategies toward eventual validation. We also investigate the usefulness of adaptive randomization (AR) and futility stopping proposed in the recent literature. Results Simulation studies demonstrate the advantages of testing both the narrowly focused enriched strategy null hypothesis related to validating a proposed strategy and the intersection null hypothesis that can accommodate to a potentially successful strategy. AR and early termination of ineffective treatments offer increased probability of receiving the preferred treatment and better response rates for patients in the trial, at the expense of more complicated inference under small-to-moderate total sample sizes and some reduction in power. Limitations The binary response used in the development phase may not be a reliable indicator of treatment benefit on long-term clinical outcomes. In the proposed design, the biomarker-guided strategy (BGS) is not compared to ‘standard of care’, such as physician’s choice that may be informed by patient characteristics. Therefore, a positive result does not imply superiority of the BGS to ‘standard of care’. The proposed design and tests are valid asymptotically. Simulations are used to examine small-to-moderate sample properties. Conclusion Innovative clinical trial designs are needed to address the difficulties and issues in the development and validation of biomarker-based personalized therapies. The article shows the advantages of using likelihood inference and interim analysis to meet the challenges in the sample size needed and in the constantly evolving biomarker landscape and genomic and proteomic technologies. PMID:22397801

  4. Maintaining and Enhancing Diversity of Sampled Protein Conformations in Robotics-Inspired Methods.

    PubMed

    Abella, Jayvee R; Moll, Mark; Kavraki, Lydia E

    2018-01-01

    The ability to efficiently sample structurally diverse protein conformations allows one to gain a high-level view of a protein's energy landscape. Algorithms from robot motion planning have been used for conformational sampling, and several of these algorithms promote diversity by keeping track of "coverage" in conformational space based on the local sampling density. However, large proteins present special challenges. In particular, larger systems require running many concurrent instances of these algorithms, but these algorithms can quickly become memory intensive because they typically keep previously sampled conformations in memory to maintain coverage estimates. In addition, robotics-inspired algorithms depend on defining useful perturbation strategies for exploring the conformational space, which is a difficult task for large proteins because such systems are typically more constrained and exhibit complex motions. In this article, we introduce two methodologies for maintaining and enhancing diversity in robotics-inspired conformational sampling. The first method addresses algorithms based on coverage estimates and leverages the use of a low-dimensional projection to define a global coverage grid that maintains coverage across concurrent runs of sampling. The second method is an automatic definition of a perturbation strategy through readily available flexibility information derived from B-factors, secondary structure, and rigidity analysis. Our results show a significant increase in the diversity of the conformations sampled for proteins consisting of up to 500 residues when applied to a specific robotics-inspired algorithm for conformational sampling. The methodologies presented in this article may be vital components for the scalability of robotics-inspired approaches.
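
    As a companion to the description of the global coverage grid, here is a minimal sketch of the underlying idea: project candidate conformations to a low-dimensional space, bin them on a shared grid, and preferentially accept samples that land in sparsely visited cells. The random projection, grid extent and acceptance rule are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    DIM, PROJ_DIM, GRID = 30, 2, 10          # conformation dim, projection dim, cells/axis
    P = rng.normal(size=(PROJ_DIM, DIM))     # fixed random projection (stand-in for PCA etc.)
    counts = np.zeros((GRID,) * PROJ_DIM)    # shared coverage grid

    def cell_of(conf):
        z = P @ conf                                  # project to low dimension
        idx = np.clip(((z + 5.0) / 10.0 * GRID).astype(int), 0, GRID - 1)
        return tuple(idx)

    def accept(conf, counts, k=1.0):
        """Prefer conformations landing in sparsely visited cells."""
        c = counts[cell_of(conf)]
        return rng.random() < k / (1.0 + c)

    kept = []
    for _ in range(5000):
        conf = rng.normal(size=DIM)                   # placeholder for a perturbed conformation
        if accept(conf, counts):
            counts[cell_of(conf)] += 1
            kept.append(conf)

    print("kept", len(kept), "conformations; occupied cells:", int((counts > 0).sum()))
    ```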

  5. Identification of problems in search strategies in Cochrane Reviews.

    PubMed

    Franco, Juan Víctor Ariel; Garrote, Virginia Laura; Escobar Liquitay, Camila Micaela; Vietto, Valeria

    2018-05-15

    Search strategies are essential for the adequate retrieval of studies in a systematic review (SR). Our objective was to identify problems in the design and reporting of search strategies in a sample of new Cochrane SRs first published in The Cochrane Library in 2015. We took a random sample of 70 new Cochrane SRs of interventions published in 2015. We evaluated their design and reporting of search strategies using the recommendations from the Cochrane Handbook for Systematic Reviews of Interventions, the Methodological Expectations of Cochrane Intervention Reviews, and the Peer Review of Electronic Search Strategies evidence-based guideline. Most reviews complied with the reporting standards in the Cochrane Handbook and the Methodological Expectations of Cochrane Intervention Reviews; however, 8 SRs did not search trials registers, 3 SRs included language restrictions, and there was inconsistent reporting of contact with individuals and searches of the gray literature. We found problems in the design of the search strategies in 73% of reviews (95% CI, 60-84%) and 53% of these contained problems (95% CI, 38-69%) that could limit both the sensitivity and precision of the search strategies. We found limitations in the design and reporting of search strategies. We consider that a greater adherence to the guidelines could improve their quality. Copyright © 2018 John Wiley & Sons, Ltd.

  6. Automation of DNA and miRNA co-extraction for miRNA-based identification of human body fluids and tissues.

    PubMed

    Kulstein, Galina; Marienfeld, Ralf; Miltner, Erich; Wiegand, Peter

    2016-10-01

    In the last years, microRNA (miRNA) analysis came into focus in the field of forensic genetics. Yet, no standardized and recommendable protocols for co-isolation of miRNA and DNA from forensic relevant samples have been developed so far. Hence, this study evaluated the performance of an automated Maxwell® 16 System-based strategy (Promega) for co-extraction of DNA and miRNA from forensically relevant (blood and saliva) samples compared to (semi-)manual extraction methods. Three procedures were compared on the basis of recovered quantity of DNA and miRNA (as determined by real-time PCR and Bioanalyzer), miRNA profiling (shown by Cq values and extraction efficiency), STR profiles, duration, contamination risk and handling. All in all, the results highlight that the automated co-extraction procedure yielded the highest miRNA and DNA amounts from saliva and blood samples compared to both (semi-)manual protocols. Also, for aged and genuine samples of forensically relevant traces the miRNA and DNA yields were sufficient for subsequent downstream analysis. Furthermore, the strategy allows miRNA extraction only in cases where it is relevant to obtain additional information about the sample type. Besides, this system enables flexible sample throughput and labor-saving sample processing with reduced risk of cross-contamination. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. a Comparison of Two Strategies for Avoiding Negative Transfer in Domain Adaptation Based on Logistic Regression

    NASA Astrophysics Data System (ADS)

    Paul, A.; Vogt, K.; Rottensteiner, F.; Ostermann, J.; Heipke, C.

    2018-05-01

    In this paper we deal with the problem of measuring the similarity between training and test datasets in the context of transfer learning (TL) for image classification. TL tries to transfer knowledge from a source domain, where labelled training samples are abundant but the data may follow a different distribution, to a target domain, where labelled training samples are scarce or even unavailable, assuming that the domains are related. Thus, the requirements w.r.t. the availability of labelled training samples in the target domain are reduced. In particular, if no labelled target data are available, it is inherently difficult to find a robust measure of relatedness between the source and target domains. This is of crucial importance for the performance of TL, because the knowledge transfer between unrelated data may lead to negative transfer, i.e. to a decrease of classification performance after transfer. We address the problem of measuring the relatedness between source and target datasets and investigate three different strategies to predict and, consequently, to avoid negative transfer in this paper. The first strategy is based on circular validation. The second strategy relies on the Maximum Mean Discrepancy (MMD) similarity metric, whereas the third one is an extension of MMD which incorporates the knowledge about the class labels in the source domain. Our method is evaluated using two different benchmark datasets. The experiments highlight the strengths and weaknesses of the investigated methods. We also show that it is possible to reduce the amount of negative transfer using these strategies for a TL method and to generate a consistent performance improvement over the whole dataset.
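
    The Maximum Mean Discrepancy mentioned as the second strategy has a compact empirical form. The sketch below computes a (biased) RBF-kernel MMD^2 between source and target feature samples on synthetic data; a large value would flag a transfer at risk of being negative. The kernel bandwidth and the data are assumptions for illustration only.

    ```python
    import numpy as np

    def mmd_rbf(X, Y, gamma=0.5):
        """Biased empirical MMD^2 between samples X and Y with an RBF kernel."""
        def k(A, B):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)
        return k(X, X).mean() + k(Y, Y).mean() - 2.0 * k(X, Y).mean()

    rng = np.random.default_rng(3)
    source = rng.normal(0.0, 1.0, size=(200, 5))
    related = rng.normal(0.2, 1.0, size=(200, 5))    # mildly shifted target
    unrelated = rng.normal(3.0, 1.0, size=(200, 5))  # strongly shifted target

    print("MMD^2 source vs related  :", round(mmd_rbf(source, related), 4))
    print("MMD^2 source vs unrelated:", round(mmd_rbf(source, unrelated), 4))
    # A large discrepancy suggests the domains are too unrelated for safe transfer.
    ```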

  8. School Counselors' and School Psychologists' Bullying Prevention and Intervention Strategies: A Look into Real-World Practices

    ERIC Educational Resources Information Center

    Lund, Emily M.; Blake, Jamilia J.; Ewing, Heidi K.; Banks, Courtney S.

    2012-01-01

    A sample of 560 school psychologists and school counselors completed a Web-based survey regarding bullying in their schools, related training, and interventions used. Few school-based mental health professionals used evidence-based bullying interventions or were involved in the selection of interventions for their school, and administrators were…

  9. Automated sample area definition for high-throughput microscopy.

    PubMed

    Zeder, M; Ellrott, A; Amann, R

    2011-04-01

    High-throughput screening platforms based on epifluorescence microscopy are powerful tools in a variety of scientific fields. Although some applications are based on imaging geometrically defined samples such as microtiter plates, multiwell slides, or spotted gene arrays, others need to cope with inhomogeneously located samples on glass slides. The analysis of microbial communities in aquatic systems by sample filtration on membrane filters followed by multiple fluorescent staining, or the investigation of tissue sections are examples. Therefore, we developed a strategy for flexible and fast definition of sample locations by the acquisition of whole slide overview images and automated sample recognition by image analysis. Our approach was tested on different microscopes and the computer programs are freely available (http://www.technobiology.ch). Copyright © 2011 International Society for Advancement of Cytometry.

  10. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research.

    PubMed

    Palinkas, Lawrence A; Horwitz, Sarah M; Green, Carla A; Wisdom, Jennifer P; Duan, Naihua; Hoagwood, Kimberly

    2015-09-01

    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research.

  11. Public attitudes toward stuttering in Turkey: probability versus convenience sampling.

    PubMed

    Ozdemir, R Sertan; St Louis, Kenneth O; Topbaş, Seyhun

    2011-12-01

    A Turkish translation of the Public Opinion Survey of Human Attributes-Stuttering (POSHA-S) was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. A convenience sample of adults in Eskişehir, Turkey was compared with two replicates of a school-based, probability cluster sampling scheme. The two replicates of the probability sampling scheme yielded similar demographic samples, both of which were different from the convenience sample. Components of subscores on the POSHA-S were significantly different in more than half of the comparisons between convenience and probability samples, indicating important differences in public attitudes. If POSHA-S users intend to generalize to specific geographic areas, results of this study indicate that probability sampling is a better research strategy than convenience sampling. The reader will be able to: (1) discuss the difference between convenience sampling and probability sampling; (2) describe a school-based probability sampling scheme; and (3) describe differences in POSHA-S results from convenience sampling versus probability sampling. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Using continuous monitoring of physical parameters to better estimate phosphorus fluxes in a small agricultural catchment

    NASA Astrophysics Data System (ADS)

    Minaudo, Camille; Dupas, Rémi; Moatar, Florentina; Gascuel-Odoux, Chantal

    2016-04-01

    Phosphorus fluxes in streams are subject to high temporal variations, which calls into question the relevance of the monitoring strategies (generally monthly sampling) chosen to assist EU Directives in capturing phosphorus fluxes and their variations over time. The objective of this study was to estimate the annual and seasonal P flux uncertainties associated with several monitoring strategies, with varying sampling frequencies, but also taking into account simultaneous and continuous time-series of parameters such as turbidity, conductivity, groundwater level and precipitation. Total Phosphorus (TP), Soluble Reactive Phosphorus (SRP) and Total Suspended Solids (TSS) concentrations were surveyed at a fine temporal frequency between 2007 and 2015 at the outlet of a small agricultural catchment in Brittany (Naizin, 5 km2). Sampling occurred every 3 to 6 days between 2007 and 2012 and daily between 2013 and 2015. Additionally, 61 storms have been intensively surveyed (1 sample every 30 minutes) since 2007. In addition, water discharge, turbidity, conductivity, groundwater level and precipitation were monitored on a sub-hourly basis. A strong temporal decoupling between SRP and particulate P (PP) was found (Dupas et al., 2015). The phosphorus-discharge relationships displayed two types of hysteretic patterns (clockwise and counterclockwise). For both cases, time-series of PP and SRP were estimated continuously for the whole period using an empirical model linking P concentrations with the hydrological and physico-chemical variables. The associated errors of the estimated P concentrations were also assessed. These "synthetic" PP and SRP time-series allowed us to discuss the most efficient monitoring strategies, first taking into account different sampling strategies based on Monte Carlo random simulations, and then adding the information from continuous data such as turbidity, conductivity and groundwater depth based on empirical modelling. Dupas et al. (2015), Distinct export dynamics for dissolved and particulate phosphorus reveal independent transport mechanisms in an arable headwater catchment, Hydrological Processes, 29(14), 3162-3178.
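
    The core evaluation, estimating flux bias for different sampling frequencies by subsampling a high-frequency record, can be illustrated in a few lines. The synthetic discharge and concentration series and the simple ratio-type flux estimator below are assumptions; they stand in for the Naizin data and the empirical model described above.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic "true" daily record: discharge with storm pulses and a
    # concentration that rises during storms (stand-in for P data).
    days = 365
    q = 1.0 + rng.gamma(0.5, 2.0, size=days)                       # discharge
    conc = 0.02 + 0.01 * (q - q.min()) / (q.max() - q.min())       # concentration
    true_flux = (conc * q).sum()                                   # arbitrary flux units

    def estimated_flux(interval, offset):
        idx = np.arange(offset, days, interval)
        # Classic ratio-type estimator: mean sampled concentration x total discharge
        return conc[idx].mean() * q.sum()

    for interval in (3, 7, 14, 30):
        errs = [(estimated_flux(interval, o) - true_flux) / true_flux
                for o in range(interval)]
        print(f"every {interval:2d} days: error range "
              f"{100*min(errs):+.1f}% to {100*max(errs):+.1f}%")
    ```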

  13. Loss of control eating in adolescents: Associations with adaptive and maladaptive emotion regulation strategies.

    PubMed

    Goossens, Lien; Van Malderen, Eva; Van Durme, Kim; Braet, Caroline

    2016-08-01

    To examine differences in the use of emotion regulation strategies in adolescents with and without loss of control over eating (LOC). A community-based sample of 524 adolescents from 12 to 18 years old (70.6% girls; Mage=15.08; SD=1.59) reported on LOC and the use of several maladaptive and adaptive emotion regulation strategies. Adolescents who experience LOC (28%) report more use of maladaptive strategies. With regard to adaptive strategies, a significant group × gender interaction effect was found, with girls who report LOC using fewer adaptive strategies. More specifically, based on the FEEL-KJ, less problem-oriented action, distraction, humor enhancement, acceptance and cognitive problem solving were observed in girls who report LOC compared to those who do not report LOC. Boys with LOC report more use of adaptive strategies compared to those who do not report LOC. LOC in adolescents is associated with increased use of maladaptive emotion regulation strategies as well as a decreased use of adaptive strategies, although the latter is specific to girls. These results may inform prevention and treatment of emotion regulation problems in adolescents with LOC. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. A new strategy for array optimization applied to Brazilian Decimetric Array

    NASA Astrophysics Data System (ADS)

    Faria, C.; Stephany, S.; Sawant, H. S.

    Radio interferometric arrays measure the Fourier transform of the sky brightness distribution in a finite set of points that are determined by the cross-correlation of different pairs of antennas of the array. The sky brightness distribution is reconstructed by the inverse Fourier transform of the sampled visibilities. The quality of the reconstructed images strongly depends on the array configuration, since it determines the sampling function and therefore the points in the Fourier plane. This work proposes a new optimization strategy for the array configuration that is based on the entropy of the distribution of the sample points in the Fourier plane. A stochastic optimizer, the Ant Colony Optimization, employs the entropy of the point distribution in the Fourier plane to iteratively refine the candidate solutions. The proposed strategy was developed for the Brazilian Decimetric Array (BDA), a radio interferometric array that is currently being developed for solar observations at the Brazilian Institute for Space Research. Configuration results corresponding to the Fourier-plane coverage, synthesized beam and side-lobe levels are shown for an optimized BDA configuration obtained with the proposed strategy and compared to the results for the standard T-array configuration that was originally proposed.
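
    Only the coverage objective is sketched here: grid the (u,v) points generated by an antenna layout and compute the Shannon entropy of the occupancy histogram, which an optimizer such as Ant Colony Optimization would seek to maximize. The layouts, grid size and extent are illustrative assumptions; the optimizer itself is omitted.

    ```python
    import numpy as np

    def uv_entropy(antenna_xy, n_bins=16, extent=100.0):
        """Shannon entropy of the gridded (u,v) point distribution produced by an
        array layout; a stand-in for the coverage objective to be maximized."""
        xy = np.asarray(antenna_xy, dtype=float)
        # Baselines: all pairwise antenna separations (both signs).
        diff = xy[:, None, :] - xy[None, :, :]
        uv = diff[~np.eye(len(xy), dtype=bool)]          # drop zero-length self pairs
        hist, _, _ = np.histogram2d(uv[:, 0], uv[:, 1],
                                    bins=n_bins, range=[[-extent, extent]] * 2)
        p = hist.ravel() / hist.sum()
        p = p[p > 0]
        return -(p * np.log(p)).sum()

    rng = np.random.default_rng(5)
    t_array = [(x, 0.0) for x in np.linspace(-40, 40, 9)] + \
              [(0.0, -y) for y in np.linspace(5, 40, 8)]   # simple T-shaped layout
    random_array = rng.uniform(-40, 40, size=(17, 2))

    print("entropy, T array     :", round(uv_entropy(t_array), 3))
    print("entropy, random array:", round(uv_entropy(random_array), 3))
    ```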

  15. A New Treatment Strategy for Inactivating Algae in Ballast Water Based on Multi-Trial Injections of Chlorine

    PubMed Central

    Sun, Jinyang; Wang, Junsheng; Pan, Xinxiang; Yuan, Haichao

    2015-01-01

    Ships’ ballast water can carry aquatic organisms into foreign ecosystems. In our previous studies, a concept using ion exchange membrane electrolysis to treat ballast water has been proven. In addition to other substantial approaches, a new strategy for inactivating algae is proposed based on the developed ballast water treatment system. In the new strategy, multi-trial injection of small doses of electrolytic products is applied to inactivate algae. To demonstrate the performance of the new strategy, comparative experiments between the new strategy and the routine process were conducted. Four algal species, including Chlorella vulgaris, Platymonas subcordiformis, Prorocentrum micans and Karenia mikimotoi, were chosen as samples. Different experimental parameters were studied, including the number of injections and the doses of electrolytic products. Compared with the conventional single-injection method, mortality rate time (MRT) and available chlorine concentration can be reduced by up to about 84% and 40%, respectively, under the new strategy. The proposed approach has great potential in practical ballast water treatment. Furthermore, the strategy also offers deeper insight into the mechanism of algal tolerance. PMID:26068239

  16. Event-triggered synchronization for reaction-diffusion complex networks via random sampling

    NASA Astrophysics Data System (ADS)

    Dong, Tao; Wang, Aijuan; Zhu, Huiyun; Liao, Xiaofeng

    2018-04-01

    In this paper, the synchronization problem of reaction-diffusion complex networks (RDCNs) with Dirichlet boundary conditions is considered, where the data is sampled randomly. An event-triggered controller based on the sampled data is proposed, which can reduce the number of control updates and the communication load. Under this strategy, the synchronization problem of the diffusion complex network is equivalently converted to the stability problem of a reaction-diffusion complex dynamical system with time delay. By using the matrix inequality technique and the Lyapunov method, synchronization conditions for the RDCNs are derived, which are dependent on the diffusion term. Moreover, it is found that the proposed control strategy naturally excludes Zeno behavior. Finally, a numerical example is given to verify the obtained results.
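
    The event-triggering idea itself is simple to demonstrate. The sketch below applies a basic trigger, transmitting a randomly sampled measurement only when it has drifted sufficiently from the last transmitted value, to a toy signal. The threshold, signal and sampling pattern are illustrative assumptions and do not reproduce the paper's delay-dependent conditions.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Randomly sampled measurements of a decaying, oscillating state.
    t = np.sort(rng.uniform(0, 10, size=400))         # random sampling instants
    x = np.exp(-0.2 * t) * np.cos(3 * t)

    threshold = 0.05
    last_sent = x[0]
    sent = [0]
    for k in range(1, len(t)):
        # Event-trigger rule: transmit only if the state has drifted enough
        # from the last transmitted value.
        if abs(x[k] - last_sent) > threshold:
            last_sent = x[k]
            sent.append(k)

    print(f"{len(t)} random samples taken, {len(sent)} transmissions triggered "
          f"({100 * len(sent) / len(t):.0f}% of the communication load)")
    ```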

  17. Proteomic analysis of tissue samples in translational breast cancer research.

    PubMed

    Gromov, Pavel; Moreira, José M A; Gromova, Irina

    2014-06-01

    In the last decade, many proteomic technologies have been applied, with varying success, to the study of tissue samples of breast carcinoma for protein expression profiling in order to discover protein biomarkers/signatures suitable for: characterization and subtyping of tumors; early diagnosis, and both prognosis and prediction of outcome of chemotherapy. The purpose of this review is to critically appraise what has been achieved to date using proteomic technologies and to bring forward novel strategies - based on the analysis of clinically relevant samples - that promise to accelerate the translation of basic discoveries into the daily breast cancer clinical practice. In particular, we address major issues in experimental design by reviewing the strengths and weaknesses of current proteomic strategies in the context of the analysis of human breast tissue specimens.

  18. Determination of Minimum Training Sample Size for Microarray-Based Cancer Outcome Prediction–An Empirical Assessment

    PubMed Central

    Cheng, Ningtao; Wu, Leihong; Cheng, Yiyu

    2013-01-01

    The promise of microarray technology in providing prediction classifiers for cancer outcome estimation has been confirmed by a number of demonstrable successes. However, the reliability of prediction results relies heavily on the accuracy of statistical parameters involved in classifiers. It cannot be reliably estimated with only a small number of training samples. Therefore, it is of vital importance to determine the minimum number of training samples and to ensure the clinical value of microarrays in cancer outcome prediction. We evaluated the impact of training sample size on model performance extensively based on 3 large-scale cancer microarray datasets provided by the second phase of MicroArray Quality Control project (MAQC-II). An SSNR-based (scale of signal-to-noise ratio) protocol was proposed in this study for minimum training sample size determination. External validation results based on another 3 cancer datasets confirmed that the SSNR-based approach could not only determine the minimum number of training samples efficiently, but also provide a valuable strategy for estimating the underlying performance of classifiers in advance. Once translated into clinical routine applications, the SSNR-based protocol would provide great convenience in microarray-based cancer outcome prediction in improving classifier reliability. PMID:23861920
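
    The SSNR protocol itself is not reproduced here. As a generic illustration of how minimum training size can be probed, the sketch below draws repeated stratified subsamples of increasing size from synthetic two-class "expression" data and tracks the test accuracy of a simple nearest-centroid classifier; the data, classifier and sizes are all assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(10)

    # Synthetic two-class "expression" matrix: 500 genes, 20 of them informative.
    n_per_class, n_genes = 200, 500
    shift = np.zeros(n_genes); shift[:20] = 1.0
    X = np.vstack([rng.normal(0.0, 1.0, (n_per_class, n_genes)),
                   rng.normal(shift, 1.0, (n_per_class, n_genes))])
    y = np.repeat([0, 1], n_per_class)

    # Hold out a fixed test set, keep the rest as a per-class training pool.
    perm = rng.permutation(len(y))
    test_idx, pool = perm[:100], perm[100:]
    pool0, pool1 = pool[y[pool] == 0], pool[y[pool] == 1]

    def nearest_centroid_accuracy(train_idx):
        c0 = X[train_idx][y[train_idx] == 0].mean(axis=0)
        c1 = X[train_idx][y[train_idx] == 1].mean(axis=0)
        pred = (((X[test_idx] - c1) ** 2).sum(axis=1) <
                ((X[test_idx] - c0) ** 2).sum(axis=1)).astype(int)
        return (pred == y[test_idx]).mean()

    for n_train in (10, 20, 40, 80, 160):
        accs = []
        for _ in range(20):   # repeated stratified subsampling at each training size
            tr = np.concatenate([rng.choice(pool0, n_train // 2, replace=False),
                                 rng.choice(pool1, n_train // 2, replace=False)])
            accs.append(nearest_centroid_accuracy(tr))
        print(f"n_train={n_train:3d}: mean test accuracy {np.mean(accs):.3f}")
    ```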

  19. A radial sampling strategy for uniform k-space coverage with retrospective respiratory gating in 3D ultrashort-echo-time lung imaging.

    PubMed

    Park, Jinil; Shin, Taehoon; Yoon, Soon Ho; Goo, Jin Mo; Park, Jang-Yeon

    2016-05-01

    The purpose of this work was to develop a 3D radial-sampling strategy which maintains uniform k-space sample density after retrospective respiratory gating, and demonstrate its feasibility in free-breathing ultrashort-echo-time lung MRI. A multi-shot, interleaved 3D radial sampling function was designed by segmenting a single-shot trajectory of projection views such that each interleaf samples k-space in an incoherent fashion. An optimal segmentation factor for the interleaved acquisition was derived based on an approximate model of respiratory patterns such that radial interleaves are evenly accepted during the retrospective gating. The optimality of the proposed sampling scheme was tested by numerical simulations and phantom experiments using human respiratory waveforms. Retrospectively, respiratory-gated, free-breathing lung MRI with the proposed sampling strategy was performed in healthy subjects. The simulation yielded the most uniform k-space sample density with the optimal segmentation factor, as evidenced by the smallest standard deviation of the number of neighboring samples as well as minimal side-lobe energy in the point spread function. The optimality of the proposed scheme was also confirmed by minimal image artifacts in phantom images. Human lung images showed that the proposed sampling scheme significantly reduced streak and ring artifacts compared with the conventional retrospective respiratory gating while suppressing motion-related blurring compared with full sampling without respiratory gating. In conclusion, the proposed 3D radial-sampling scheme can effectively suppress the image artifacts due to non-uniform k-space sample density in retrospectively respiratory-gated lung MRI by uniformly distributing gated radial views across the k-space. Copyright © 2016 John Wiley & Sons, Ltd.
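
    The benefit of interleaving can be illustrated without any MRI machinery. In the sketch below, a single-shot ordering of radial views is split either into contiguous blocks or into strided interleaves; a crude "respiratory gate" then keeps one contiguous run per shot, and the spread of the accepted views is compared. This is only a conceptual proxy for the proposed k-space scheme, with all parameters assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(12)

    n_views, n_segments = 4000, 8                      # total radial views, interleaves
    order = np.arange(n_views)                         # position along single-shot trajectory

    # Two ways of splitting a single-shot view ordering into shots:
    blocks = order.reshape(n_segments, -1)                            # contiguous blocks
    interleaves = [order[s::n_segments] for s in range(n_segments)]   # strided interleaves

    # Crude respiratory gating: accept ~40% of each shot in one contiguous run
    # (views acquired during the quiescent part of the breathing cycle).
    def gate(shot):
        n_acc = int(0.4 * len(shot))
        start = rng.integers(0, len(shot) - n_acc)
        return shot[start:start + n_acc]

    def spread(shots):
        kept = np.concatenate([gate(s) for s in shots])
        hist, _ = np.histogram(kept, bins=20, range=(0, n_views))
        return hist.std() / hist.mean()                # lower = more uniform coverage

    print("coverage non-uniformity, contiguous blocks:", round(spread(blocks), 2))
    print("coverage non-uniformity, interleaved shots:", round(spread(interleaves), 2))
    ```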

  20. RNA-Stabilized Whole Blood Samples but Not Peripheral Blood Mononuclear Cells Can Be Stored for Prolonged Time Periods Prior to Transcriptome Analysis

    PubMed Central

    Debey-Pascher, Svenja; Hofmann, Andrea; Kreusch, Fatima; Schuler, Gerold; Schuler-Thurner, Beatrice; Schultze, Joachim L.; Staratschek-Jox, Andrea

    2011-01-01

    Microarray-based transcriptome analysis of peripheral blood as surrogate tissue has become an important approach in clinical implementations. However, application of gene expression profiling in routine clinical settings requires careful consideration of the influence of sample handling and RNA isolation methods on gene expression profile outcome. We evaluated the effect of different sample preservation strategies (eg, cryopreservation of peripheral blood mononuclear cells or freezing of PAXgene-stabilized whole blood samples) on gene expression profiles. Expression profiles obtained from cryopreserved peripheral blood mononuclear cells differed substantially from those of their nonfrozen counterpart samples. Furthermore, expression profiles in cryopreserved peripheral blood mononuclear cell samples were found to undergo significant alterations with increasing storage period, whereas long-term freezing of PAXgene RNA stabilized whole blood samples did not significantly affect stability of gene expression profiles. This report describes important technical aspects contributing toward the establishment of robust and reliable guidance for gene expression studies using peripheral blood and provides a promising strategy for reliable implementation in routine handling for diagnostic purposes. PMID:21704280

  1. Modified graphene oxide sensors for ultra-sensitive detection of nitrate ions in water.

    PubMed

    Ren, Wen; Mura, Stefania; Irudayaraj, Joseph M K

    2015-10-01

    The nitrate ion is a very common contaminant in drinking water and has a significant impact on the environment, necessitating routine monitoring. Due to its chemical and physical properties, it is hard to detect directly with high sensitivity in a simple and inexpensive manner. Herein, with amino-group-modified graphene oxide (GO) as the sensing element, we show a direct and ultra-sensitive method to detect nitrate ions, at a lowest detected concentration of 5 nM in river water samples, much lower than the reported methods based on absorption spectroscopy. Furthermore, unlike the reported strategies based on absorption spectroscopy, wherein the nitrate concentration is determined by monitoring an increase in aggregation of gold nanoparticles (GNPs), our method evaluates the concentration of nitrate ions based on a reduction in aggregation of GNPs for monitoring in real samples. To improve sensitivity, several optimizations were performed, including the assessment of the amount of modified GO required, the concentration of GNPs and the incubation time. The detection methodology was characterized by zeta potential, TEM and SEM. Our results indicate that an enrichment of modified GO with nitrate ions contributed to excellent sensitivity, and the entire detection procedure could be completed within 75 min with only 20 μl of sample. This simple and rapid methodology was applied to monitor nitrate ions in real samples with excellent sensitivity and minimum pretreatment. The proposed approach paves the way for a novel means to detect anions in real samples and highlights the potential of a GO-based detection strategy for water quality monitoring. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Validation of a Sampling Method to Collect Exposure Data for Central-Line-Associated Bloodstream Infections.

    PubMed

    Hammami, Naïma; Mertens, Karl; Overholser, Rosanna; Goetghebeur, Els; Catry, Boudewijn; Lambert, Marie-Laurence

    2016-05-01

    Surveillance of central-line-associated bloodstream infections requires the labor-intensive counting of central-line days (CLDs). This workload could be reduced by sampling. Our objective was to evaluate the accuracy of various sampling strategies in the estimation of CLDs in intensive care units (ICUs) and to establish a set of rules to identify optimal sampling strategies depending on ICU characteristics. Analyses of existing data collected according to the European protocol for patient-based surveillance of ICU-acquired infections in Belgium between 2004 and 2012. CLD data were reported by 56 ICUs in 39 hospitals during 364 trimesters. We compared estimated CLD data obtained from weekly and monthly sampling schemes with the observed exhaustive CLD data over the trimester by assessing the CLD percentage error (i.e., [observed CLDs - estimated CLDs]/observed CLDs). We identified predictors of improved accuracy using linear mixed models. When sampling once per week or 3 times per month, 80% of ICU trimesters had a CLD percentage error within 10%. When sampling twice per week, this was >90% of ICU trimesters. Sampling on Tuesdays provided the best estimations. In the linear mixed model, the observed CLD count was the best predictor for a smaller percentage error. The following sampling strategies provided an estimate within 10% of the actual CLD for 97% of the ICU trimesters with 90% confidence: 3 times per month in an ICU with >650 CLDs per trimester or each Tuesday in an ICU with >480 CLDs per trimester. Sampling of CLDs provides an acceptable alternative to daily collection of CLD data.
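
    The accuracy measure is easy to reproduce on synthetic data: sample a daily central-line census once a week or three times a month, scale up to the trimester, and compare with the exhaustive count. The Poisson census and the specific sampling days below are placeholders for real surveillance data.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Synthetic daily census of catheterized patients over one trimester (91 days).
    daily_cld = rng.poisson(lam=12, size=91)
    observed = daily_cld.sum()                     # exhaustive central-line days

    def estimate(sample_days):
        """Scale the mean of the sampled days up to the full trimester."""
        return daily_cld[sample_days].mean() * len(daily_cld)

    weekly_tuesdays = np.arange(1, 91, 7)                        # once per week
    monthly_3x = np.array([2, 12, 22, 32, 42, 52, 62, 72, 82])   # ~3 samples per month

    for label, days in [("weekly (Tuesdays)", weekly_tuesdays),
                        ("3x per month", monthly_3x)]:
        est = estimate(days)
        pct_err = (observed - est) / observed * 100
        print(f"{label:18s} estimate={est:7.1f}  percentage error={pct_err:+.1f}%")
    ```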

  3. Standard operating procedures for collection of soil and sediment samples for the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study

    USGS Publications Warehouse

    Fisher, Shawn C.; Reilly, Timothy J.; Jones, Daniel K.; Benzel, William M.; Griffin, Dale W.; Loftin, Keith A.; Iwanowicz, Luke R.; Cohl, Jonathan A.

    2015-12-17

    An understanding of the effects on human and ecological health brought by major coastal storms or flooding events is typically limited because of a lack of regionally consistent baseline and trends data in locations proximal to potential contaminant sources and mitigation activities, sensitive ecosystems, and recreational facilities where exposures are probable. In an attempt to close this gap, the U.S. Geological Survey (USGS) has implemented the Sediment-bound Contaminant Resiliency and Response (SCoRR) strategy pilot study to collect regional sediment-quality data prior to and in response to future coastal storms. The standard operating procedure (SOP) detailed in this document serves as the sample-collection protocol for the SCoRR strategy by providing step-by-step instructions for site preparation, sample collection and processing, and shipping of soil and surficial sediment (for example, bed sediment, marsh sediment, or beach material). The objectives of the SCoRR strategy pilot study are (1) to create a baseline of soil-, sand-, marsh sediment-, and bed-sediment-quality data from sites located in the coastal counties from Maine to Virginia based on their potential risk of being contaminated in the event of a major coastal storm or flooding (defined as Resiliency mode); and (2) respond to major coastal storms and flooding by reoccupying select baseline sites and sampling within days of the event (defined as Response mode). For both modes, samples are collected in a consistent manner to minimize bias and maximize quality control by ensuring that all sampling personnel across the region collect, document, and process soil and sediment samples following the procedures outlined in this SOP. Samples are analyzed using four USGS-developed screening methods—inorganic geochemistry, organic geochemistry, pathogens, and biological assays—which are also outlined in this SOP. Because the SCoRR strategy employs a multi-metric approach for sample analyses, this protocol expands upon and reconciles differences in the sample collection protocols outlined in the USGS “National Field Manual for the Collection of Water-Quality Data,” which should be used in conjunction with this SOP. A new data entry and sample tracking system also is presented to ensure all relevant data and metadata are gathered at the sample locations and in the laboratories.

  4. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    NASA Astrophysics Data System (ADS)

    Saha, Debasish; Kemanian, Armen R.; Rau, Benjamin M.; Adler, Paul R.; Montes, Felipe

    2017-04-01

    Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (corn-soybean rotation), College Station, TX (corn-vetch rotation), Fort Collins, CO (irrigated corn), and Pullman, WA (winter wheat), representing diverse agro-ecoregions of the United States. Fertilization source, rate, and timing were site-specific. These simulated fluxes surrogated daily measurements in the analysis. We "sampled" the fluxes using a fixed interval (1-32 days) or a rule-based (decision tree-based) sampling method. Two types of decision trees were built: a high-input tree (HI) that included soil inorganic nitrogen (SIN) as a predictor variable, and a low-input tree (LI) that excluded SIN. Other predictor variables were identified with Random Forest. The decision trees were inverted to be used as rules for sampling a representative number of members from each terminal node. The uncertainty of the annual N2O flux estimation increased along with the fixed interval length. A 4- and 8-day fixed sampling interval was required at College Station and Ames, respectively, to yield ±20% accuracy in the flux estimate; a 12-day interval rendered the same accuracy at Fort Collins and Pullman. Both the HI and the LI rule-based methods provided the same accuracy as that of fixed interval method with up to a 60% reduction in sampling events, particularly at locations with greater temporal flux variability. For instance, at Ames, the HI rule-based and the fixed interval methods required 16 and 91 sampling events, respectively, to achieve the same absolute bias of 0.2 kg N ha-1 yr-1 in estimating cumulative N2O flux. These results suggest that using simulation models along with decision trees can reduce the cost and improve the accuracy of the estimations of cumulative N2O fluxes using the discrete chamber-based method.
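
    The contrast between fixed-interval and rule-based sampling can be mimicked on a synthetic flux series in which emission pulses follow wet days. The hand-written event rule below is only a stand-in for the decision trees described above; the flux model, trigger and interval lengths are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic daily N2O flux: low background plus pulses after "wet" days,
    # mimicking post-rain / post-fertilization emission bursts.
    days = 365
    wet = rng.random(days) < 0.15
    flux = rng.gamma(1.0, 2.0, size=days)
    flux[wet] += rng.gamma(5.0, 10.0, size=wet.sum())
    annual_true = flux.sum()

    def cumulative(sample_idx):
        """Linearly interpolate between sampled days, then sum the daily values."""
        return np.interp(np.arange(days), sample_idx, flux[sample_idx]).sum()

    fixed = np.arange(0, days, 16)                         # fixed 16-day interval
    # Rule-based: keep a sparse backbone, plus the day after every wet day.
    rule = np.unique(np.concatenate([np.arange(0, days, 32),
                                     np.clip(np.where(wet)[0] + 1, 0, days - 1)]))

    for label, idx in [("fixed 16-day", fixed), ("rule-based", rule)]:
        est = cumulative(idx)
        print(f"{label:12s}: {len(idx):3d} samples, "
              f"bias = {(est - annual_true) / annual_true * 100:+.1f}%")
    ```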

  5. An implementation of differential evolution algorithm for inversion of geoelectrical data

    NASA Astrophysics Data System (ADS)

    Balkaya, Çağlayan

    2013-11-01

    Differential evolution (DE), a population-based evolutionary algorithm (EA) has been implemented to invert self-potential (SP) and vertical electrical sounding (VES) data sets. The algorithm uses three operators including mutation, crossover and selection similar to genetic algorithm (GA). Mutation is the most important operator for the success of DE. Three commonly used mutation strategies including DE/best/1 (strategy 1), DE/rand/1 (strategy 2) and DE/rand-to-best/1 (strategy 3) were applied together with a binomial type crossover. Evolution cycle of DE was realized without boundary constraints. For the test studies performed with SP data, in addition to both noise-free and noisy synthetic data sets two field data sets observed over the sulfide ore body in the Malachite mine (Colorado) and over the ore bodies in the Neem-Ka Thana cooper belt (India) were considered. VES test studies were carried out using synthetically produced resistivity data representing a three-layered earth model and a field data set example from Gökçeada (Turkey), which displays a seawater infiltration problem. Mutation strategies mentioned above were also extensively tested on both synthetic and field data sets in consideration. Of these, strategy 1 was found to be the most effective strategy for the parameter estimation by providing less computational cost together with a good accuracy. The solutions obtained by DE for the synthetic cases of SP were quite consistent with particle swarm optimization (PSO) which is a more widely used population-based optimization algorithm than DE in geophysics. Estimated parameters of SP and VES data were also compared with those obtained from Metropolis-Hastings (M-H) sampling algorithm based on simulated annealing (SA) without cooling to clarify uncertainties in the solutions. Comparison to the M-H algorithm shows that DE performs a fast approximate posterior sampling for the case of low-dimensional inverse geophysical problems.
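
    For readers who want to see the three ingredients named above, mutation, binomial crossover and greedy selection, the sketch below runs a bare-bones DE loop with the DE/best/1 and DE/rand/1 strategies on a toy least-squares misfit. The objective function is a placeholder for the SP/VES forward problems, and the control parameters are assumptions, not the paper's settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    def objective(m):
        # Toy misfit standing in for an SP/VES data-fit functional.
        target = np.array([1.5, -0.7, 2.0])
        return ((m - target) ** 2).sum()

    def de(strategy="best1", n_pop=30, n_dim=3, F=0.7, CR=0.9, n_gen=200):
        pop = rng.uniform(-5, 5, size=(n_pop, n_dim))
        fit = np.array([objective(p) for p in pop])
        for _ in range(n_gen):
            best = pop[fit.argmin()]
            for i in range(n_pop):
                r1, r2, r3 = rng.choice([j for j in range(n_pop) if j != i], 3, replace=False)
                if strategy == "best1":          # DE/best/1
                    mutant = best + F * (pop[r1] - pop[r2])
                else:                            # DE/rand/1
                    mutant = pop[r1] + F * (pop[r2] - pop[r3])
                # Binomial crossover
                cross = rng.random(n_dim) < CR
                cross[rng.integers(n_dim)] = True
                trial = np.where(cross, mutant, pop[i])
                f_trial = objective(trial)
                if f_trial < fit[i]:             # greedy selection
                    pop[i], fit[i] = trial, f_trial
        return pop[fit.argmin()], fit.min()

    for s in ("best1", "rand1"):
        m, f = de(strategy=s)
        print(f"DE/{s}: best model {np.round(m, 3)}, misfit {f:.2e}")
    ```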

  6. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research

    PubMed Central

    Palinkas, Lawrence A.; Horwitz, Sarah M.; Green, Carla A.; Wisdom, Jennifer P.; Duan, Naihua; Hoagwood, Kimberly

    2013-01-01

    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research. PMID:24193818

  7. Household economic strategies and nutritional anthropometry of women in American Samoa and highland Bolivia.

    PubMed

    Bindon, James R; Vitzthum, Virginia J

    2002-04-01

    This study compares findings from research projects involving different genetic, environmental, and cultural contexts: a study of lifestyle and health from American Samoa (ASLS) and the Bolivian project Reproduction and Ecology in Provincia Aroma (REPA). This paper presents analyses of varying economic strategies and their association with nutritional status indicators in each population. The ASLS sample includes 66 Samoan women and the REPA sample includes 210 Aymara women. Principal component analysis of household economic resources within each sample extracted two significant factors: one represents modernizing influences including education and occupational status, and the other represents ethnographically salient traditional economic behavior. The traditional pattern includes adding household members in Samoa and selling agricultural products in Bolivia. This analysis places each woman along two continua, traditional and modern, based on her household mobilization of economic resources, permitting an understanding of the patterns underlying household economic behavior that is not possible in univariate analyses of socioeconomic variables. For the Bolivian women the strategy involving more education and higher occupational status was associated with higher measures of several nutritional status indicators, including body mass index, arm muscle area, and peripheral skinfolds. But among the Samoan women, where substantial obesity was the norm, there were no significant differences in anthropometric measurements based on economic strategies. These data argue for the importance of directly measuring the potential consequences of variation in household economic strategies rather than merely inferring such, and of assessing ethnographically relevant aspects of household economic production rather than limiting analyses to non-context-specific economic indicators such as income. This focus on household strategy is likely to be fruitful especially where economic and nutritional conditions are marginal. The findings from Bolivia also support efforts in developing countries to improve girls' education, and thereby occupational prospects, as a means to improve their health status as women.

  8. Investigating the Effectiveness of Teaching Methods Based on a Four-Step Constructivist Strategy

    NASA Astrophysics Data System (ADS)

    Çalik, Muammer; Ayas, Alipaşa; Coll, Richard K.

    2010-02-01

    This paper reports on an investigation of the effectiveness of an intervention using several different methods for teaching solution chemistry. The teaching strategy comprised a four-step approach derived from a constructivist view of learning. A sample consisting of 44 students (18 boys and 26 girls) was selected purposively from two different Grade 9 classes in the city of Trabzon, Turkey. Data collection employed a purpose-designed `solution chemistry concept test', consisting of 17 items, with the quantitative data from the survey supported by qualitative interview data. The findings suggest that using different methods embedded within the four-step constructivist-based teaching strategy enables students to refute some alternative conceptions, but does not completely eliminate student alternative conceptions for solution chemistry.

  9. The price elasticity of demand for heroin: Matched longitudinal and experimental evidence.

    PubMed

    Olmstead, Todd A; Alessi, Sheila M; Kline, Brendan; Pacula, Rosalie Liccardo; Petry, Nancy M

    2015-05-01

    This paper reports estimates of the price elasticity of demand for heroin based on a newly constructed dataset. The dataset has two matched components concerning the same sample of regular heroin users: longitudinal information about real-world heroin demand (actual price and actual quantity at daily intervals for each heroin user in the sample) and experimental information about laboratory heroin demand (elicited by presenting the same heroin users with scenarios in a laboratory setting). Two empirical strategies are used to estimate the price elasticity of demand for heroin. The first strategy exploits the idiosyncratic variation in the price experienced by a heroin user over time that occurs in markets for illegal drugs. The second strategy exploits the experimentally induced variation in price experienced by a heroin user across experimental scenarios. Both empirical strategies result in the estimate that the conditional price elasticity of demand for heroin is approximately -0.80. Copyright © 2015 Elsevier B.V. All rights reserved.
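
    A constant-elasticity demand model makes the estimation idea concrete: regress log quantity on log price and read the elasticity off the slope. The simulated data below are not the study's matched dataset; the true elasticity is set to -0.8 only to echo the reported estimate.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Simulate purchase records with a true price elasticity of -0.8.
    n = 2000
    log_price = rng.normal(np.log(20), 0.3, size=n)            # idiosyncratic price variation
    log_qty = 2.0 - 0.8 * log_price + rng.normal(0, 0.2, n)    # constant-elasticity demand + noise

    # OLS of log(quantity) on log(price): the slope is the price elasticity.
    X = np.column_stack([np.ones(n), log_price])
    beta, *_ = np.linalg.lstsq(X, log_qty, rcond=None)
    print("estimated price elasticity of demand:", round(beta[1], 3))
    ```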

  10. Mindfulness as a Coping Strategy for Bias-Based School Victimization among Latina/o Sexual Minority Youth

    PubMed Central

    Toomey, Russell B.; Anhalt, Karla

    2016-01-01

    This study examined whether mindfulness strategies (e.g., acting non-judgmentally with awareness and attention to present events) were effective in mitigating the associations among school-based victimization related to ethnicity and sexual orientation, well-being (i.e., depressive symptoms and self-esteem), and grade-point average (GPA). The U.S.-based sample included 236 Latina/o sexual minority students, ranging in age from 14 to 24 years (47% were enrolled in secondary schools, 53% in postsecondary schools). Results from structural equation modeling revealed that ethnicity-based school victimization was negatively associated with GPA but not well-being. However, sexual orientation-based victimization was not associated with well-being or GPA. Mindfulness was positively associated with well-being but not GPA. High levels of mindfulness coping were protective when the stressor was sexual orientation-based victimization but not ethnicity-based school victimization. These findings contribute to a growing literature documenting the unique school barriers experienced by Latina/o sexual minority youth and highlight the promising utility of mindfulness-based intervention strategies for coping with minority stress. PMID:28018933

  11. An Examination of the Tripartite Model of Depressive and Anxiety Symptoms in African American Youth: Stressors and Coping Strategies as Common and Specific Correlates

    ERIC Educational Resources Information Center

    Gaylord-Harden, Noni K.; Elmore, Corinn A.; Campbell, Cynthya L.; Wethington, Anna

    2011-01-01

    The purpose of the current study was to examine the tripartite model of depression and anxiety in a community-based sample of 278 African American adolescents (M age = 12.89) from low-income communities and to identify stressors and coping strategies that were associated with the specific features of each disorder. Participants reported on…

  12. Recent Advances in On-Line Methods Based on Extraction for Speciation Analysis of Chromium in Environmental Matrices.

    PubMed

    Trzonkowska, Laura; Leśniewska, Barbara; Godlewska-Żyłkiewicz, Beata

    2016-07-03

    The biological activity of Cr(III) and Cr(VI) species, their chemical behavior, and toxic effects are dissimilar. The speciation analysis of Cr(III) and Cr(VI) in environmental matrices is then of great importance and much research has been devoted to this area. This review presents recent developments in on-line speciation analysis of chromium in such samples. Flow systems have proved to be excellent tools for automation of sample pretreatment, separation/preconcentration of chromium species, and their detection by various instrumental techniques. Analytical strategies used in chromium speciation analysis discussed in this review are divided into categories based on selective extraction/separation of chromium species on solid sorbents and liquid-liquid extraction of chromium species. The most popular strategy is that based on solid-phase extraction. Therefore, this review shows the potential of novel materials designed and used for selective binding of chromium species. The progress in miniaturization of measurement systems is also presented.

  13. A robust high resolution reversed-phase HPLC strategy to investigate various metabolic species in different biological models.

    PubMed

    D'Alessandro, Angelo; Gevi, Federica; Zolla, Lello

    2011-04-01

    Recent advancements in the field of omics sciences have paved the way for further expansion of metabolomics. Originally tied to NMR spectroscopy, metabolomic disciplines increasingly involve HPLC- and mass spectrometry (MS)-based analytical strategies and, in this context, we hereby propose a robust and efficient extraction protocol for metabolites from four different biological sources which are subsequently analysed, identified and quantified through high resolution reversed-phase fast HPLC and mass spectrometry. To this end, we demonstrate the elevated intra- and inter-day technical reproducibility, ease of an MRM-based MS method, allowing simultaneous detection of up to 10 distinct features, and robustness of multiple metabolite detection and quantification in four different biological samples. This strategy might become routinely applicable to various samples/biological matrices, especially for low-availability ones. In parallel, we compare the present strategy for targeted detection of a representative metabolite, L-glutamic acid, with our previously proposed chemical derivatization with dansyl chloride. A direct comparison of the present method against spectrophotometric assays is proposed as well. An application of the proposed method is also introduced, using the SAOS-2 cell line, either induced or non-induced to express the TAp63 isoform of the p63 gene, as a model for determining variations in glutamate concentration.

  14. Model For Marketing Strategy Decision Based On Multicriteria Decision Making: A Case Study In Batik Madura Industry

    NASA Astrophysics Data System (ADS)

    Anna, I. D.; Cahyadi, I.; Yakin, A.

    2018-01-01

    Selection of marketing strategy is a prominent competitive advantage for small and medium enterprises business development. The selection process is a multiple criteria decision-making problem, which includes evaluation of various attributes or criteria in a process of strategy formulation. The objective of this paper is to develop a model for the selection of a marketing strategy in the Batik Madura industry. The current study proposes an integrated approach based on the analytic network process (ANP) and the technique for order preference by similarity to ideal solution (TOPSIS) to determine the best strategy for Batik Madura marketing problems. Based on the results of a group decision-making technique, this study selected fourteen criteria, including consistency, cost, trend following, customer loyalty, business volume, uniqueness, manpower, customer numbers, promotion, branding, business network, outlet location, credibility and innovation, as Batik Madura marketing strategy evaluation criteria. A survey questionnaire developed from a literature review was distributed to a sample frame of Batik Madura SMEs in Pamekasan. In the decision procedure step, expert evaluators were asked to establish the decision matrix by comparing the marketing strategy alternatives under each of the individual criteria. Then, considerations obtained from the ANP and TOPSIS methods were applied to build the specific criteria constraints and range of the launch strategy in the model. The model in this study demonstrates that, under the current business situation, the Straight-focus marketing strategy is the best marketing strategy for Batik Madura SMEs in Pamekasan.
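
    The TOPSIS ranking step can be sketched in a few lines; the decision matrix, criterion weights (standing in for ANP-derived priorities), and benefit/cost flags below are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    def topsis(decision, weights, benefit):
        """Rank alternatives (rows) against criteria (columns) with TOPSIS."""
        norm = decision / np.linalg.norm(decision, axis=0)          # vector normalization
        v = norm * weights                                          # weighted normalized matrix
        ideal      = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti_ideal = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti_ideal, axis=1)
        return d_neg / (d_pos + d_neg)                              # closeness to ideal; higher is better

    # Three hypothetical marketing strategies scored on four of the criteria.
    decision = np.array([[7, 3, 8, 6],
                         [5, 2, 9, 7],
                         [8, 4, 6, 5]], dtype=float)
    weights = np.array([0.4, 0.2, 0.25, 0.15])    # e.g. ANP-derived priorities
    benefit = np.array([True, False, True, True]) # False = cost-type criterion
    print(topsis(decision, weights, benefit))     # closeness coefficients per strategy
    ```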

  15. A new modeling strategy for third-order fast high-performance liquid chromatographic data with fluorescence detection. Quantitation of fluoroquinolones in water samples.

    PubMed

    Alcaráz, Mirta R; Bortolato, Santiago A; Goicoechea, Héctor C; Olivieri, Alejandro C

    2015-03-01

    Matrix augmentation is regularly employed in extended multivariate curve resolution-alternating least-squares (MCR-ALS), as applied to analytical calibration based on second- and third-order data. However, this highly useful concept has almost no correspondence in parallel factor analysis (PARAFAC) of third-order data. In the present work, we propose a strategy to process third-order chromatographic data with matrix fluorescence detection, based on an Augmented PARAFAC model. The latter involves decomposition of a three-way data array augmented along the elution time mode with data for the calibration samples and for each of the test samples. A set of excitation-emission fluorescence matrices, measured at different chromatographic elution times for drinking water samples, containing three fluoroquinolones and uncalibrated interferences, were evaluated using this approach. Augmented PARAFAC exploits the second-order advantage, even in the presence of significant changes in chromatographic profiles from run to run. The obtained relative errors of prediction were ca. 10 % for ofloxacin, ciprofloxacin, and danofloxacin, with a significant enhancement in analytical figures of merit in comparison with previous reports. The results are compared with those furnished by MCR-ALS.
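
    A rough sketch of the augmentation idea follows, assuming the tensorly library and illustrative array sizes: excitation-emission matrices measured at successive elution times are stacked along the time mode for all calibration runs plus one test run, and the stacked array is decomposed with a trilinear (PARAFAC/CP) model.

    ```python
    import numpy as np
    from tensorly.decomposition import parafac

    # Hypothetical EEM stacks: (elution times, emission, excitation) per sample.
    n_t, n_em, n_ex, rank = 40, 60, 12, 4
    calibration = [np.random.rand(n_t, n_em, n_ex) for _ in range(5)]
    test_sample = np.random.rand(n_t, n_em, n_ex)

    # Augment along the elution-time mode: calibration runs plus one test run.
    augmented = np.concatenate(calibration + [test_sample], axis=0)

    # Trilinear decomposition of the augmented array (PARAFAC/CP model).
    weights, (time_mode, em_mode, ex_mode) = parafac(augmented, rank=rank, n_iter_max=200)

    # Per-sample scores are recovered by splitting the augmented time-mode factor
    # back into the sub-profiles belonging to each run and integrating their areas.
    sub_profiles = np.split(time_mode, len(calibration) + 1, axis=0)
    scores = np.array([p.sum(axis=0) for p in sub_profiles])  # rows: samples, columns: components
    ```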

  16. Evaluation of intensity drift correction strategies using MetaboDrift, a normalization tool for multi-batch metabolomics data.

    PubMed

    Thonusin, Chanisa; IglayReger, Heidi B; Soni, Tanu; Rothberg, Amy E; Burant, Charles F; Evans, Charles R

    2017-11-10

    In recent years, mass spectrometry-based metabolomics has increasingly been applied to large-scale epidemiological studies of human subjects. However, the successful use of metabolomics in this context is subject to the challenge of detecting biologically significant effects despite substantial intensity drift that often occurs when data are acquired over a long period or in multiple batches. Numerous computational strategies and software tools have been developed to aid in correcting for intensity drift in metabolomics data, but most of these techniques are implemented using command-line driven software and custom scripts which are not accessible to all end users of metabolomics data. Further, it has not yet become routine practice to assess the quantitative accuracy of drift correction against techniques which enable true absolute quantitation such as isotope dilution mass spectrometry. We developed an Excel-based tool, MetaboDrift, to visually evaluate and correct for intensity drift in a multi-batch liquid chromatography - mass spectrometry (LC-MS) metabolomics dataset. The tool enables drift correction based on either quality control (QC) samples analyzed throughout the batches or using QC-sample independent methods. We applied MetaboDrift to an original set of clinical metabolomics data from a mixed-meal tolerance test (MMTT). The performance of the method was evaluated for multiple classes of metabolites by comparison with normalization using isotope-labeled internal standards. QC sample-based intensity drift correction significantly improved correlation with IS-normalized data, and resulted in detection of additional metabolites with significant physiological response to the MMTT. The relative merits of different QC-sample curve fitting strategies are discussed in the context of batch size and drift pattern complexity. Our drift correction tool offers a practical, simplified approach to drift correction and batch combination in large metabolomics studies. Copyright © 2017 Elsevier B.V. All rights reserved.
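
    As a sketch of one QC-sample-based correction strategy of the kind the tool implements (the function and variable names here are assumptions), a smoothing curve is fitted to the QC intensities across injection order and every injection is divided by the interpolated curve.

    ```python
    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    def qc_drift_correct(intensity, run_order, is_qc, frac=0.5):
        """Divide each intensity by a LOWESS drift curve fitted to the QC samples."""
        fit = lowess(intensity[is_qc], run_order[is_qc], frac=frac, return_sorted=True)
        drift = np.interp(run_order, fit[:, 0], fit[:, 1])      # interpolate curve to every injection
        return intensity / drift * np.median(intensity[is_qc])  # rescale to the QC median

    # Hypothetical single-metabolite example: 100 injections, every 10th is a QC sample.
    order = np.arange(100)
    is_qc = (order % 10 == 0)
    signal = 1e5 * (1 - 0.003 * order) * np.random.normal(1.0, 0.05, 100)  # simulated drift
    corrected = qc_drift_correct(signal, order, is_qc)
    ```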

  17. Knowledge-based nonuniform sampling in multidimensional NMR.

    PubMed

    Schuyler, Adam D; Maciejewski, Mark W; Arthanari, Haribabu; Hoch, Jeffrey C

    2011-07-01

    The full resolution afforded by high-field magnets is rarely realized in the indirect dimensions of multidimensional NMR experiments because of the time cost of uniformly sampling to long evolution times. Emerging methods utilizing nonuniform sampling (NUS) enable high resolution along indirect dimensions by sampling long evolution times without sampling at every multiple of the Nyquist sampling interval. While the earliest NUS approaches matched the decay of sampling density to the decay of the signal envelope, recent approaches based on coupled evolution times attempt to optimize sampling by choosing projection angles that increase the likelihood of resolving closely-spaced resonances. These approaches employ knowledge about chemical shifts to predict optimal projection angles, whereas prior applications of tailored sampling employed only knowledge of the decay rate. In this work we adapt the matched filter approach as a general strategy for knowledge-based nonuniform sampling that can exploit prior knowledge about chemical shifts and is not restricted to sampling projections. Based on several measures of performance, we find that exponentially weighted random sampling (envelope matched sampling) performs better than shift-based sampling (beat matched sampling). While shift-based sampling can yield small advantages in sensitivity, the gains are generally outweighed by diminished robustness. Our observation that more robust sampling schemes are only slightly less sensitive than schemes highly optimized using prior knowledge about chemical shifts has broad implications for any multidimensional NMR study employing NUS. The results derived from simulated data are demonstrated with a sample application to PfPMT, the phosphoethanolamine methyltransferase of the human malaria parasite Plasmodium falciparum.
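
    A minimal sketch of envelope-matched (exponentially weighted) random sampling for one indirect dimension follows; the grid size, number of retained increments, and decay constant are illustrative.

    ```python
    import numpy as np

    def envelope_matched_schedule(n_grid, n_samples, t2, dwell, seed=0):
        """Pick n_samples of n_grid Nyquist grid points with probability ~ exp(-t/T2)."""
        rng = np.random.default_rng(seed)
        t = np.arange(n_grid) * dwell
        p = np.exp(-t / t2)
        p /= p.sum()
        idx = rng.choice(n_grid, size=n_samples, replace=False, p=p)
        return np.sort(idx)

    # e.g. keep 128 of 512 increments, assuming T2 = 20 ms and a 100 us dwell time
    schedule = envelope_matched_schedule(n_grid=512, n_samples=128, t2=0.020, dwell=100e-6)
    ```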

  18. [Identification of Systemic Contaminations with Legionella Spec. in Drinking Water Plumbing Systems: Sampling Strategies and Corresponding Parameters].

    PubMed

    Völker, S; Schreiber, C; Müller, H; Zacharias, N; Kistemann, T

    2017-05-01

    After the amendment of the Drinking Water Ordinance in 2011, the requirements for the hygienic-microbiological monitoring of drinking water installations have increased significantly. In the BMBF-funded project "Biofilm Management" (2010-2014), we examined the extent to which established sampling strategies in practice can uncover drinking water plumbing systems systemically colonized with Legionella. Moreover, we investigated additional parameters that might be suitable for detecting systemic contaminations. We subjected the drinking water plumbing systems of 8 buildings with known microbial contamination (Legionella) to an intensive hygienic-microbiological sampling with high spatial and temporal resolution. A total of 626 drinking hot water samples were analyzed with classical culture-based methods. In addition, comprehensive hygienic observations were conducted in each building and qualitative interviews with operators and users were carried out. Collected tap-specific parameters were quantitatively analyzed by means of sensitivity and accuracy calculations. The systemic presence of Legionella in drinking water plumbing systems has a high spatial and temporal variability. Established sampling strategies were only partially suitable to detect long-term Legionella contaminations in practice. In particular, the sampling of hot water at the calorifier and circulation re-entrance showed little significance in terms of contamination events. To detect the systemic presence of Legionella, the parameters stagnation (qualitatively assessed) and temperature (compliance with the 5K-rule) showed better results. © Georg Thieme Verlag KG Stuttgart · New York.

  19. A systematic strategy for the identification and determination of pharmaceuticals in the environment using advanced LC-MS tools: application to ground water samples.

    PubMed

    Jindal, Kriti; Narayanam, Mallikarjun; Singh, Saranjit

    2015-04-10

    In the present study, a novel analytical strategy was employed to study the occurrence of 40 drug residues belonging to different medicinal classes, e.g., antibiotics, β blockers, NSAIDs, antidiabetics, proton pump inhibitors, H2 receptor antagonists, antihypertensives, antihyperlipidemics, etc. in ground water samples collected from villages adjoining to S.A.S. Nagar, Punjab, India. The drug residues were extracted from the samples using solid-phase extraction, and LC-ESI-HRMS and LC-ESI-MS/MS were used for identification and quantitation of the analytes. Initially, qualifier and quantifier MRM transitions were classified for 40 targeted drugs, followed by development of LC-MS methods for the separation of all the drugs, which were divided into three categories to curtail overlapping of peaks. Overall identification was done through matching of retention times and MRM transitions; matching of intensity ratio of qualifier to quantifier transitions; comparison of base peak MS/MS profiles; and evaluation of isotopic abundances (wherever applicable). Final confirmation was carried out through comparison of accurate masses obtained from HRMS studies for both standard and targeted analytes in the samples. The application of the strategy allowed removal of false positives and helped in identification and quantitation of diclofenac in the ground water samples of four villages, and pitavastatin in a sample of one village. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Limited sampling strategy models for estimating the AUC of gliclazide in Chinese healthy volunteers.

    PubMed

    Huang, Ji-Han; Wang, Kun; Huang, Xiao-Hui; He, Ying-Chun; Li, Lu-Jin; Sheng, Yu-Cheng; Yang, Juan; Zheng, Qing-Shan

    2013-06-01

    The aim of this work is to reduce the cost of the sampling required to estimate the area under the gliclazide plasma concentration versus time curve within 60 h (AUC0-60t). The limited sampling strategy (LSS) models were established and validated by multiple regression models using 4 or fewer gliclazide concentration values. Absolute prediction error (APE), root mean square error (RMSE) and visual prediction check were used as criteria. The results of Jack-Knife validation showed that 10 (25.0 %) of the 40 LSS models based on the regression analysis were not within an APE of 15 % when using one concentration-time point. 90.2, 91.5 and 92.4 % of the 40 LSS models were capable of prediction using 2, 3 and 4 points, respectively. Limited sampling strategies were developed and validated for estimating the AUC0-60t of gliclazide. This study indicates that the implementation of an 80 mg dosage regimen enabled accurate predictions of AUC0-60t by the LSS model. This study shows that 12, 6, 4, and 2 h after administration are the key sampling times. The combination of (12, 2 h), (12, 8, 2 h) or (12, 8, 4, 2 h) can be chosen as sampling hours for predicting AUC0-60t in practical applications according to requirements.
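
    A simplified sketch of an LSS model of this kind is shown below, using simulated data and the (12, 8, 2 h) combination mentioned above, with APE and RMSE as the evaluation criteria.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    n = 40
    # Hypothetical concentrations (mg/L) at the key sampling times and the full reference AUC0-60.
    c2, c8, c12 = rng.normal(4, 1, n), rng.normal(3, 0.8, n), rng.normal(2, 0.6, n)
    auc = 6 * c2 + 10 * c8 + 14 * c12 + rng.normal(0, 5, n)   # simulated reference AUC0-60

    X = np.column_stack([c12, c8, c2])            # LSS using the (12, 8, 2 h) combination
    model = LinearRegression().fit(X, auc)
    pred = model.predict(X)

    ape = 100 * np.abs(pred - auc) / auc          # absolute prediction error, %
    rmse = np.sqrt(np.mean((pred - auc) ** 2))
    print(f"APE < 15% for {np.mean(ape < 15):.0%} of subjects, RMSE = {rmse:.1f}")
    ```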

  1. Integrated Sampling Strategy (ISS) Guide

    Treesearch

    Robert E. Keane; Duncan C. Lutes

    2006-01-01

    What is an Integrated Sampling Strategy? Simply put, it is the strategy that guides how plots are put on the landscape. FIREMON’s Integrated Sampling Strategy assists fire managers as they design their fire monitoring project by answering questions such as: What statistical approach is appropriate for my sample design? How many plots can I afford? How many plots do I...

  2. Irregular and adaptive sampling for automatic geophysic measure systems

    NASA Astrophysics Data System (ADS)

    Avagnina, Davide; Lo Presti, Letizia; Mulassano, Paolo

    2000-07-01

    In this paper a sampling method based on an irregular and adaptive strategy is described. It can be used as an automatic guide for rovers designed to explore terrestrial and planetary environments. Starting from the hypothesis that an exploration vehicle is equipped with a payload able to acquire measurements of quantities of interest, the method can detect objects of interest from the measured points and realize adaptive sampling, while only coarsely describing the uninteresting background.

  3. Simultaneous escaping of explicit and hidden free energy barriers: application of the orthogonal space random walk strategy in generalized ensemble based conformational sampling.

    PubMed

    Zheng, Lianqing; Chen, Mengen; Yang, Wei

    2009-06-21

    To overcome the pseudoergodicity problem, conformational sampling can be accelerated via generalized ensemble methods, e.g., through the realization of random walks along prechosen collective variables, such as spatial order parameters, energy scaling parameters, or even system temperatures or pressures. As usually observed in generalized ensemble simulations, hidden barriers are likely to exist in the space perpendicular to the collective variable direction, and these residual free energy barriers can greatly reduce the sampling efficiency. This sampling issue is particularly severe when the collective variable is defined in a low-dimension subset of the target system; then the "Hamiltonian lagging" problem, in which necessary structural relaxation falls behind the move of the collective variable, is likely to occur. To overcome this problem in equilibrium conformational sampling, we adopted the orthogonal space random walk (OSRW) strategy, which was originally developed in the context of free energy simulation [L. Zheng, M. Chen, and W. Yang, Proc. Natl. Acad. Sci. U.S.A. 105, 20227 (2008)]. Thereby, generalized ensemble simulations can simultaneously escape both the explicit barriers along the collective variable direction and the hidden barriers that are strongly coupled with the collective variable move. As demonstrated in our model studies, the present OSRW-based generalized ensemble treatments show improved sampling capability over the corresponding classical generalized ensemble treatments.

  4. A strategy for simultaneous determination of fatty acid composition, fatty acid position, and position-specific isotope contents in triacylglycerol matrices by 13C-NMR.

    PubMed

    Merchak, Noelle; Silvestre, Virginie; Loquet, Denis; Rizk, Toufic; Akoka, Serge; Bejjani, Joseph

    2017-01-01

    Triacylglycerols, which are quasi-universal components of food matrices, consist of complex mixtures of molecules. Their site-specific 13 C content, their fatty acid profile, and their position on the glycerol moiety may significantly vary with the geographical, botanical, or animal origin of the sample. Such variables are valuable tracers for food authentication issues. The main objective of this work was to develop a new method based on a rapid and precise 13 C-NMR spectroscopy (using a polarization transfer technique) coupled with multivariate linear regression analyses in order to quantify the whole set of individual fatty acids within triacylglycerols. In this respect, olive oil samples were analyzed by means of both adiabatic 13 C-INEPT sequence and gas chromatography (GC). For each fatty acid within the studied matrix and for squalene as well, a multivariate prediction model was constructed using the deconvoluted peak areas of 13 C-INEPT spectra as predictors, and the data obtained by GC as response variables. This 13 C-NMR-based strategy, tested on olive oil, could serve as an alternative to the gas chromatographic quantification of individual fatty acids in other matrices, while providing additional compositional and isotopic information. Graphical abstract A strategy based on the multivariate linear regression of variables obtained by a rapid 13 C-NMR technique was developed for the quantification of individual fatty acids within triacylglycerol matrices. The conceived strategy was tested on olive oil.
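
    The multivariate regression step can be sketched as a multi-output linear model mapping deconvoluted 13C-INEPT peak areas to GC-determined fatty acid percentages; the arrays below are simulated placeholders, not data from the study.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    n_samples, n_peaks, n_fatty_acids = 30, 25, 6

    nmr_areas = rng.random((n_samples, n_peaks))      # deconvoluted 13C-INEPT peak areas
    true_coef = rng.random((n_peaks, n_fatty_acids))
    gc_percent = nmr_areas @ true_coef + rng.normal(0, 0.05, (n_samples, n_fatty_acids))  # GC reference

    # One multi-output linear model predicts all fatty-acid percentages from the NMR peak areas.
    model = LinearRegression()
    predicted = cross_val_predict(model, nmr_areas, gc_percent, cv=5)
    rmsep = np.sqrt(np.mean((predicted - gc_percent) ** 2, axis=0))  # one RMSEP per fatty acid
    ```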

  5. The Role of Personal Epistemology in the Self-Regulation of Internet-Based Learning

    ERIC Educational Resources Information Center

    Stromso, Helge I.; Braten, Ivar

    2010-01-01

    The predictability of Internet-specific epistemic beliefs for self-regulated learning within Internet technologies was examined in a sample of 84 physics undergraduates. Dimensions of Internet-specific epistemic beliefs were found to explain unique variance in Internet-based search, help-seeking, and self-regulatory strategies, respectively.…

  6. Using a Sampling Strategy to Address Psychometric Challenges in Tutorial-Based Assessments

    ERIC Educational Resources Information Center

    Eva, Kevin W.; Solomon, Patty; Neville, Alan J.; Ladouceur, Michael; Kaufman, Karyn; Walsh, Allyn; Norman, Geoffrey R.

    2007-01-01

    Introduction: Tutorial-based assessment, despite providing a good match with the philosophy adopted by educational programmes that emphasize small group learning, remains one of the greatest challenges for educators working in this context. The current study was performed in an attempt to assess the psychometric characteristics of tutorial-based…

  7. Quality of artemisinin-based combination formulations for malaria treatment: prevalence and risk factors for poor quality medicines in public facilities and private sector drug outlets in Enugu, Nigeria.

    PubMed

    Kaur, Harparkash; Allan, Elizabeth Louise; Mamadu, Ibrahim; Hall, Zoe; Ibe, Ogochukwu; El Sherbiny, Mohamed; van Wyk, Albert; Yeung, Shunmay; Swamidoss, Isabel; Green, Michael D; Dwivedi, Prabha; Culzoni, Maria Julia; Clarke, Siân; Schellenberg, David; Fernández, Facundo M; Onwujekwe, Obinna

    2015-01-01

    Artemisinin-based combination therapies are recommended by the World Health Organisation (WHO) as first-line treatment for Plasmodium falciparum malaria, yet medication must be of good quality for efficacious treatment. A recent meta-analysis reported 35% (796/2,296) of antimalarial drug samples from 21 Sub-Saharan African countries, purchased from outlets predominantly using convenience sampling, failed chemical content analysis. We used three sampling strategies to purchase artemisinin-containing antimalarials (ACAs) in Enugu metropolis, Nigeria, and compared the resulting quality estimates. ACAs were purchased using three sampling approaches--convenience, mystery clients and overt, within a defined area and sampling frame in Enugu metropolis. The active pharmaceutical ingredients were assessed using high-performance liquid chromatography and confirmed by mass spectrometry at three independent laboratories. Results were expressed as percentage of APIs stated on the packaging and used to categorise each sample as acceptable quality, substandard, degraded, or falsified. Content analysis of 3024 samples purchased from 421 outlets using convenience (n=200), mystery (n=1,919) and overt (n=905) approaches, showed overall 90.8% ACAs to be of acceptable quality, 6.8% substandard, 1.3% degraded and 1.2% falsified. Convenience sampling yielded a significantly higher prevalence of poor quality ACAs, but was not evident by the mystery and overt sampling strategies both of which yielded results that were comparable between each other. Artesunate (n=135; 4 falsified) and dihydroartemisinin (n=14) monotherapy tablets, not recommended by WHO, were also identified. Randomised sampling identified fewer falsified ACAs than previously reported by convenience approaches. Our findings emphasise the need for specific consideration to be given to sampling frame and sampling approach if representative information on drug quality is to be obtained.

  8. Online Doppler Effect Elimination Based on Unequal Time Interval Sampling for Wayside Acoustic Bearing Fault Detecting System

    PubMed Central

    Ouyang, Kesai; Lu, Siliang; Zhang, Shangbin; Zhang, Haibin; He, Qingbo; Kong, Fanrang

    2015-01-01

    The railway occupies a fairly important position in transportation due to its high speed and strong transportation capability. As a consequence, it is a key issue to guarantee the continuous running and transportation safety of trains. Meanwhile, the time consumption of the diagnosis procedure is of extreme importance for the detecting system. However, most of the currently adopted techniques in the wayside acoustic defective bearing detector system (ADBD) are offline strategies, which means that the signal is analyzed after the sampling process. This results in unavoidable time latency. Besides, the acquired acoustic signal is corrupted by the Doppler effect because of the high relative speed between the train and the data acquisition system (DAS). Thus, it is difficult to effectively diagnose the bearing defects immediately. In this paper, a new strategy called online Doppler effect elimination (ODEE) is proposed to remove the Doppler distortion online by the introduced unequal interval sampling scheme. The steps of the proposed strategy are as follows: the essential parameters are acquired in advance; then, the introduced unequal time interval sampling strategy is used to restore the Doppler-distorted signal, and the amplitude of the signal is demodulated as well. Thus, the restored Doppler-free signal is obtained online. The proposed ODEE method has been employed in simulation analysis. Ultimately, the ODEE method is implemented in an embedded system for fault diagnosis of train bearings. The results are in good accordance with the bearing defects, which verifies the good performance of the proposed strategy. PMID:26343657
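
    A hedged sketch of the core resampling idea follows (the geometry and parameters are assumptions for illustration, not the paper's implementation): uniformly spaced emission times map to unequally spaced reception times through the source-microphone distance, and interpolating the recording at those reception times restores a Doppler-free waveform.

    ```python
    import numpy as np

    def doppler_free_resample(signal, fs, v, r0, t_pass, c=343.0):
        """Resample a wayside recording so it corresponds to uniform emission times.

        signal : recorded waveform, sampled uniformly at fs (Hz)
        v      : train speed (m/s), r0 : closest approach distance (m)
        t_pass : time (s) at which the source passes the microphone
        """
        t_rec = np.arange(len(signal)) / fs                 # uniform reception-time grid
        t_emit = t_rec.copy()                               # desired uniform emission-time grid
        dist = np.sqrt(r0**2 + (v * (t_emit - t_pass))**2)  # source-microphone distance at emission
        t_at_mic = t_emit + dist / c                        # unequal reception times for those emissions
        return np.interp(t_at_mic, t_rec, signal)           # unequal-interval resampling

    # e.g. a 50 kHz recording of a train passing at 30 m/s, 2 m from the microphone
    fs = 50_000
    rec = np.random.normal(size=fs)                         # placeholder for the recorded signal
    restored = doppler_free_resample(rec, fs, v=30.0, r0=2.0, t_pass=0.5)
    ```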

  9. Low-dose 4D cardiac imaging in small animals using dual source micro-CT

    NASA Astrophysics Data System (ADS)

    Holbrook, M.; Clark, D. P.; Badea, C. T.

    2018-01-01

    Micro-CT is widely used in preclinical studies, generating substantial interest in extending its capabilities in functional imaging applications such as blood perfusion and cardiac function. However, imaging cardiac structure and function in mice is challenging due to their small size and rapid heart rate. To overcome these challenges, we propose and compare improvements on two strategies for cardiac gating in dual-source, preclinical micro-CT: fast prospective gating (PG) and uncorrelated retrospective gating (RG). These sampling strategies combined with a sophisticated iterative image reconstruction algorithm provide faster acquisitions and high image quality in low-dose 4D (i.e. 3D  +  Time) cardiac micro-CT. Fast PG is performed under continuous subject rotation which results in interleaved projection angles between cardiac phases. Thus, fast PG provides a well-sampled temporal average image for use as a prior in iterative reconstruction. Uncorrelated RG incorporates random delays during sampling to prevent correlations between heart rate and sampling rate. We have performed both simulations and animal studies to validate these new sampling protocols. Sampling times for 1000 projections using fast PG and RG were 2 and 3 min, respectively, and the total dose was 170 mGy each. Reconstructions were performed using a 4D iterative reconstruction technique based on the split Bregman method. To examine undersampling robustness, subsets of 500 and 250 projections were also used for reconstruction. Both sampling strategies in conjunction with our iterative reconstruction method are capable of resolving cardiac phases and provide high image quality. In general, for equal numbers of projections, fast PG shows fewer errors than RG and is more robust to undersampling. Our results indicate that only 1000-projection based reconstruction with fast PG satisfies a 5% error criterion in left ventricular volume estimation. These methods promise low-dose imaging with a wide range of preclinical applications in cardiac imaging.

  10. DEVELOPMENT OF A STRATEGY FOR SAMPLING TREE RINGS

    EPA Science Inventory

    A method for determining retrospective pollution levels has been investigated. This method relates arsenic concentration in tree rings to arsenic-in-air concentrations based qualitatively on arsenic emissions from a nearby smelter, corrected for climatological and meteorological ...

  11. A plasmonic colorimetric strategy for visual miRNA detection based on hybridization chain reaction

    NASA Astrophysics Data System (ADS)

    Miao, Jie; Wang, Jingsheng; Guo, Jinyang; Gao, Huiguang; Han, Kun; Jiang, Chengmin; Miao, Peng

    2016-08-01

    In this work, a novel colorimetric strategy for miRNA analysis is proposed based on hybridization chain reaction (HCR)-mediated localized surface plasmon resonance (LSPR) variation of silver nanoparticles (AgNPs). miRNA in the sample to be tested is able to release HCR initiator from a solid interface to AgNPs colloid system by toehold exchange-mediated strand displacement, which then triggers the consumption of fuel strands with single-stranded tails for HCR. The final produced long nicked double-stranded DNA loses the ability to protect AgNPs from salt-induced aggregation. The stability variation of the colloid system can then be monitored by recording corresponding UV-vis spectrum and initial miRNA level is thus determined. This sensing system involves only four DNA strands which is quite simple. The practical utility is confirmed to be excellent by employing different biological samples.

  12. Widefield fluorescence microscopy with sensor-based conjugate adaptive optics using oblique back illumination

    PubMed Central

    Li, Jiang; Bifano, Thomas G.; Mertz, Jerome

    2016-01-01

    We describe a wavefront sensor strategy for the implementation of adaptive optics (AO) in microscope applications involving thick, scattering media. The strategy is based on the exploitation of multiple scattering to provide oblique back illumination of the wavefront-sensor focal plane, enabling a simple and direct measurement of the flux-density tilt angles caused by aberrations at this plane. Advantages of the sensor are that it provides a large measurement field of view (FOV) while requiring no guide star, making it particularly adapted to a type of AO called conjugate AO, which provides a large correction FOV in cases when sample-induced aberrations arise from a single dominant plane (e.g., the sample surface). We apply conjugate AO here to widefield (i.e., nonscanning) fluorescence microscopy for the first time and demonstrate dynamic wavefront correction in a closed-loop implementation. PMID:27653793

  13. A Systematic Review of Promising Strategies of Faith-Based Cancer Education and Lifestyle Interventions Among Racial/Ethnic Minority Groups.

    PubMed

    Hou, Su-I; Cao, Xian

    2017-09-13

    Church-based interventions have been used to reach racial/ethnic minorities. In order to develop effective programs, we conducted a comprehensive systematic review of faith-based cancer prevention studies (2005~2016) to examine their characteristics and promising strategies. The combination terms "church or faith-based or religion," "intervention or program," and "cancer education or lifestyle" were used in searching five major databases: CINAHL; ERIC; Health Technology Assessments; MEDLINE; and PsycInfo. A total of 20 studies met the study criteria. The CDC's Community Guide was used to analyze and review group interventions. Analyses were organized by two racial groups: African American (AA) and Latino/Hispanic American groups. Results showed that most studies reviewed focused on breast cancer alone or in combination with other cancers. Studies of Latino/Hispanic groups more often targeted uninsured, Medicare, or Medicaid individuals, whereas AA studies generally did not include specific insurance criteria. The sample sizes of the AA studies were generally larger. The majority of the studies reviewed used pre-post, posttest only with control group, or quasi-experimental designs. The Health Belief Model was the most commonly used theory in both groups. Community-based participatory research and empowerment/ecological frameworks were also used frequently in the Latino/Hispanic studies. Small media and group education were the top two most popular intervention strategies in both groups. Although a one-on-one strategy was used in some Latino studies, neither group used the strategy of reducing client out-of-pocket costs. Client reminders could also be used more in both groups. The current review showed that church-based cancer education programs were effective in changing knowledge, but not always screening utilization. Results show faith-based cancer educational interventions are promising. To maximize intervention impact, future studies might consider using stronger study designs, incorporating a variety of proven effective strategies, including frequently used evidence-based strategies, as well as exploring promising strategies among specific target groups.

  14. An optimal sample data usage strategy to minimize overfitting and underfitting effects in regression tree models based on remotely-sensed data

    USGS Publications Warehouse

    Gu, Yingxin; Wylie, Bruce K.; Boyte, Stephen; Picotte, Joshua J.; Howard, Danny; Smith, Kelcy; Nelson, Kurtis

    2016-01-01

    Regression tree models have been widely used for remote sensing-based ecosystem mapping. Improper use of the sample data (model training and testing data) may cause overfitting and underfitting effects in the model. The goal of this study is to develop an optimal sampling data usage strategy for any dataset and identify an appropriate number of rules in the regression tree model that will improve its accuracy and robustness. Landsat 8 data and Moderate-Resolution Imaging Spectroradiometer-scaled Normalized Difference Vegetation Index (NDVI) were used to develop regression tree models. A Python procedure was designed to generate random replications of model parameter options across a range of model development data sizes and rule number constraints. The mean absolute difference (MAD) between the predicted and actual NDVI (scaled NDVI, value from 0–200) and its variability across the different randomized replications were calculated to assess the accuracy and stability of the models. In our case study, a six-rule regression tree model developed from 80% of the sample data had the lowest MAD (MADtraining = 2.5 and MADtesting = 2.4), which was suggested as the optimal model. This study demonstrates how the training data and rule number selections impact model accuracy and provides important guidance for future remote-sensing-based ecosystem modeling.
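
    The randomized-replication procedure can be sketched with scikit-learn standing in for the rule-based regression tree package used in the study; the training fractions, leaf-node limits (a proxy for rule number), and synthetic data below are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(42)
    X = rng.random((2000, 6))                      # e.g. Landsat band reflectances (placeholder)
    y = 200 * rng.random(2000)                     # scaled NDVI target (0-200)

    results = {}
    for frac in (0.6, 0.7, 0.8, 0.9):              # model-development data fractions
        for leaves in (4, 6, 8, 12, 20):           # proxy for the number of rules
            mads = []
            for rep in range(20):                  # randomized replications
                Xtr, Xte, ytr, yte = train_test_split(X, y, train_size=frac, random_state=rep)
                tree = DecisionTreeRegressor(max_leaf_nodes=leaves, random_state=rep).fit(Xtr, ytr)
                mads.append((np.mean(np.abs(tree.predict(Xtr) - ytr)),
                             np.mean(np.abs(tree.predict(Xte) - yte))))
            results[(frac, leaves)] = np.mean(mads, axis=0)   # (MAD_training, MAD_testing)

    best = min(results, key=lambda k: results[k][1])          # lowest test MAD suggests the setting
    ```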

  15. Correcting for Sample Contamination in Genotype Calling of DNA Sequence Data

    PubMed Central

    Flickinger, Matthew; Jun, Goo; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2015-01-01

    DNA sample contamination is a frequent problem in DNA sequencing studies and can result in genotyping errors and reduced power for association testing. We recently described methods to identify within-species DNA sample contamination based on sequencing read data, showed that our methods can reliably detect and estimate contamination levels as low as 1%, and suggested strategies to identify and remove contaminated samples from sequencing studies. Here we propose methods to model contamination during genotype calling as an alternative to removal of contaminated samples from further analyses. We compare our contamination-adjusted calls to calls that ignore contamination and to calls based on uncontaminated data. We demonstrate that, for moderate contamination levels (5%–20%), contamination-adjusted calls eliminate 48%–77% of the genotyping errors. For lower levels of contamination, our contamination correction methods produce genotypes nearly as accurate as those based on uncontaminated data. Our contamination correction methods are useful generally, but are particularly helpful for sample contamination levels from 2% to 20%. PMID:26235984

  16. Developing trading strategies based on fractal finance: An application of MF-DFA in the context of Islamic equities

    NASA Astrophysics Data System (ADS)

    Dewandaru, Ginanjar; Masih, Rumi; Bacha, Obiyathulla Ismath; Masih, A. Mansur. M.

    2015-11-01

    We provide a new contribution to trading strategies by using multi-fractal de-trended fluctuation analysis (MF-DFA), imported from econophysics, to complement various momentum strategies. The method provides a single measure that can capture both persistency and anti-persistency in stock prices, accounting for multifractality. This study uses a sample of Islamic stocks listed in the U.S. Dow Jones Islamic market for a sample period covering 16 years starting in 1996. The findings show that the MF-DFA strategy produces monthly excess returns of 6.12%, outperforming other various momentum strategies. Even though the risk of the MF-DFA strategy may be relatively higher, it can still produce a Sharpe ratio of 0.164, which is substantially higher than that of the other strategies. When we control for the MF-DFA factor with the other factors, its pure factor return is still able to yield a monthly excess return of 1.35%. Finally, we combine the momentum and MF-DFA strategies, with the proportions of 90/10, 80/20, and 70/30 and by doing so we demonstrate that the MF-DFA measure can boost the total monthly excess returns as well as Sharpe ratio. The value added is non-linear which implies that the additional returns are associated with lower incremental risk.
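
    A compact sketch of the MF-DFA measurement itself is given below (the trading rules built on top of it are not reproduced); it returns the generalized Hurst exponents h(q), whose spread across q indicates multifractality and whose level indicates persistency or anti-persistency.

    ```python
    import numpy as np

    def mfdfa(x, scales, q_values, order=1):
        """Generalized Hurst exponents h(q) by multifractal detrended fluctuation analysis."""
        profile = np.cumsum(x - np.mean(x))
        h = []
        for q in q_values:
            log_fq = []
            for s in scales:
                n_seg = len(profile) // s
                f2 = []
                for i in range(n_seg):
                    seg = profile[i * s:(i + 1) * s]
                    t = np.arange(s)
                    trend = np.polyval(np.polyfit(t, seg, order), t)   # local detrending
                    f2.append(np.mean((seg - trend) ** 2))
                f2 = np.array(f2)
                if q == 0:
                    fq = np.exp(0.5 * np.mean(np.log(f2)))
                else:
                    fq = np.mean(f2 ** (q / 2)) ** (1 / q)
                log_fq.append(np.log(fq))
            h.append(np.polyfit(np.log(scales), log_fq, 1)[0])          # slope = h(q)
        return np.array(h)

    returns = np.random.normal(0, 0.01, 4000)                    # placeholder for daily stock returns
    scales = np.array([16, 32, 64, 128, 256])
    hq = mfdfa(returns, scales, q_values=[-3, -1, 0, 1, 3])      # spread of h(q) reveals multifractality
    ```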

  17. To mix or not to mix venous blood samples collected in vacuum tubes?

    PubMed

    Parenmark, Anna; Landberg, Eva

    2011-09-08

    There are recommendations to mix venous blood samples by inverting the tubes immediately after venipuncture. Though mixing allows efficient anticoagulation in plasma tubes and fast initiation of coagulation in serum tubes, the effect on laboratory analyses and the risk of haemolysis have not been thoroughly evaluated. Venous blood samples were collected by venipuncture in vacuum tubes from 50 patients (10 or 20 patients in each group). Four types of tubes and 18 parameters used in routine clinical chemistry were evaluated. For each patient and tube, three types of mixing strategies were used: instant mixing, no mixing, and 5 min of rest followed by mixing. Most analyses did not differ significantly between samples subjected to the different mixing strategies. Plasma lactate dehydrogenase and the haemolysis index showed a small but significant increase in samples subjected to instant mixing compared to samples without mixing. However, in one out of twenty non-mixed samples, activated partial thromboplastin time was seriously affected. These results indicate that mixing blood samples after venipuncture is not mandatory for all types of tubes. Instant mixing may introduce interference for those analyses susceptible to haemolysis. However, tubes with liquid-based citrate buffer for coagulation testing should be mixed to avoid clotting.

  18. Air sampling workshop: October 24-25, 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-06-01

    A two-day workshop was held in October 1978 on air sampling strategies for the occupational environment. Strategies comprise the elements of implementing an air sampling program, including deciding on the extent of sampling, selecting appropriate types of measurement, placing sampling instruments properly, and interpreting sample results correctly. All of these elements are vital in the reliable assessment of occupational exposures, yet their coverage in the industrial hygiene literature is meager. Although keyed to a few introductory topics, the agenda was sufficiently informal to accommodate extemporaneous discussion on any subject related to sampling strategies. Questions raised during the workshop mirror the status of air sampling strategy as much as the factual information that was presented. It may be concluded from the discussion and questions that air sampling strategy is at an elementary stage and urgently needs concerted attention from the industrial hygiene profession.

  19. Development and pilot study of a marketing strategy for primary care/internet-based depression prevention intervention for adolescents (the CATCH-IT intervention).

    PubMed

    Van Voorhees, Benjamin W; Watson, Natalie; Bridges, John F P; Fogel, Joshua; Galas, Jill; Kramer, Clarke; Connery, Marc; McGill, Ann; Marko, Monika; Cardenas, Alonso; Landsback, Josephine; Dmochowska, Karoline; Kuwabara, Sachiko A; Ellis, Justin; Prochaska, Micah; Bell, Carl

    2010-01-01

    Adolescent depression is both common and burdensome, and while evidence-based strategies have been developed to prevent adolescent depression, participation in such interventions remains extremely low, with less than 3% of at-risk individuals participating. To promote participation in evidence-based preventive strategies, a rigorous marketing strategy is needed to translate research into practice. To develop and pilot a rigorous marketing strategy for engaging at-risk individuals with an Internet-based depression prevention intervention in primary care targeting key attitudes and beliefs. A marketing design group was constituted to develop a marketing strategy based on the principles of targeting, positioning/competitor analysis, decision analysis, and promotion/distribution and incorporating contemporary models of behavior change. We evaluated the formative quality of the intervention and observed the fielding experience for prevention using a pilot study (observational) design. The marketing plan focused on "resiliency building" rather than "depression intervention" and was relayed by office staff and the Internet site. Twelve practices successfully implemented the intervention and recruited a diverse sample of adolescents with > 30% of all those with positive screens and > 80% of those eligible after phone assessment enrolling in the study with a cost of $58 per enrollee. Adolescent motivation for depression prevention (1-10 scale) increased from a baseline mean value of 7.45 (SD = 2.05) to 8.07 poststudy (SD = 1.33) (P = .048). Marketing strategies for preventive interventions for mental disorders can be developed and successfully introduced and marketed in primary care.

  20. Development and Pilot Study of a Marketing Strategy for Primary Care/Internet–Based Depression Prevention Intervention for Adolescents (The CATCH-IT Intervention)

    PubMed Central

    Watson, Natalie; Bridges, John F. P.; Fogel, Joshua; Galas, Jill; Kramer, Clarke; Connery, Marc; McGill, Ann; Marko, Monika; Cardenas, Alonso; Landsback, Josephine; Dmochowska, Karoline; Kuwabara, Sachiko A.; Ellis, Justin; Prochaska, Micah; Bell, Carl

    2010-01-01

    Background: Adolescent depression is both common and burdensome, and while evidence-based strategies have been developed to prevent adolescent depression, participation in such interventions remains extremely low, with less than 3% of at-risk individuals participating. To promote participation in evidence-based preventive strategies, a rigorous marketing strategy is needed to translate research into practice. Objective: To develop and pilot a rigorous marketing strategy for engaging at-risk individuals with an Internet-based depression prevention intervention in primary care targeting key attitudes and beliefs. Method: A marketing design group was constituted to develop a marketing strategy based on the principles of targeting, positioning/competitor analysis, decision analysis, and promotion/distribution and incorporating contemporary models of behavior change. We evaluated the formative quality of the intervention and observed the fielding experience for prevention using a pilot study (observational) design. Results: The marketing plan focused on “resiliency building” rather than “depression intervention” and was relayed by office staff and the Internet site. Twelve practices successfully implemented the intervention and recruited a diverse sample of adolescents with > 30% of all those with positive screens and > 80% of those eligible after phone assessment enrolling in the study with a cost of $58 per enrollee. Adolescent motivation for depression prevention (1–10 scale) increased from a baseline mean value of 7.45 (SD = 2.05) to 8.07 poststudy (SD = 1.33) (P = .048). Conclusions: Marketing strategies for preventive interventions for mental disorders can be developed and successfully introduced and marketed in primary care. PMID:20944776

  1. Comparison of Depletion Strategies for the Enrichment of Low-Abundance Proteins in Urine.

    PubMed

    Filip, Szymon; Vougas, Konstantinos; Zoidakis, Jerome; Latosinska, Agnieszka; Mullen, William; Spasovski, Goce; Mischak, Harald; Vlahou, Antonia; Jankowski, Joachim

    2015-01-01

    Proteome analysis of complex biological samples for biomarker identification remains challenging, among other reasons because of the extended range of protein concentrations. High-abundance proteins of plasma and urine, such as albumin and IgG, may interfere with the detection of potential disease biomarkers. Currently, several options are available for the depletion of abundant proteins in plasma. However, the applicability of these methods in urine has not been thoroughly investigated. In this study, we compared different, commercially available immunodepletion and ion-exchange based approaches on urine samples from both healthy subjects and CKD patients, for their reproducibility and efficiency in protein depletion. A starting urine volume of 500 μL was used to simulate conditions of a multi-institutional biomarker discovery study. All depletion approaches showed satisfactory reproducibility (n=5) in protein identification as well as protein abundance. Comparison of the depletion efficiency between the unfractionated and fractionated samples and the different depletion strategies showed efficient depletion in all cases, with the exception of the ion-exchange kit. The depletion efficiency was found to be slightly higher in normal than in CKD samples, and normal samples yielded more protein identifications than CKD samples when using both the initial and the corresponding depleted fractions. Along these lines, a decrease in the amount of albumin and other targets, as applicable, was observed following depletion. Nevertheless, these depletion strategies did not yield a higher number of identifications in the urine of either normal subjects or CKD patients. Collectively, when analyzing urine in the context of CKD biomarker identification, no added value of depletion strategies can be observed, and analysis of unfractionated starting urine appears to be preferable.

  2. Comparison of Depletion Strategies for the Enrichment of Low-Abundance Proteins in Urine

    PubMed Central

    Filip, Szymon; Vougas, Konstantinos; Zoidakis, Jerome; Latosinska, Agnieszka; Mullen, William; Spasovski, Goce; Mischak, Harald; Vlahou, Antonia; Jankowski, Joachim

    2015-01-01

    Proteome analysis of complex biological samples for biomarker identification remains challenging, among other reasons because of the extended range of protein concentrations. High-abundance proteins of plasma and urine, such as albumin and IgG, may interfere with the detection of potential disease biomarkers. Currently, several options are available for the depletion of abundant proteins in plasma. However, the applicability of these methods in urine has not been thoroughly investigated. In this study, we compared different, commercially available immunodepletion and ion-exchange based approaches on urine samples from both healthy subjects and CKD patients, for their reproducibility and efficiency in protein depletion. A starting urine volume of 500 μL was used to simulate conditions of a multi-institutional biomarker discovery study. All depletion approaches showed satisfactory reproducibility (n=5) in protein identification as well as protein abundance. Comparison of the depletion efficiency between the unfractionated and fractionated samples and the different depletion strategies showed efficient depletion in all cases, with the exception of the ion-exchange kit. The depletion efficiency was found to be slightly higher in normal than in CKD samples, and normal samples yielded more protein identifications than CKD samples when using both the initial and the corresponding depleted fractions. Along these lines, a decrease in the amount of albumin and other targets, as applicable, was observed following depletion. Nevertheless, these depletion strategies did not yield a higher number of identifications in the urine of either normal subjects or CKD patients. Collectively, when analyzing urine in the context of CKD biomarker identification, no added value of depletion strategies can be observed, and analysis of unfractionated starting urine appears to be preferable. PMID:26208298

  3. Memory and communication support in dementia: research-based strategies for caregivers.

    PubMed

    Smith, Erin R; Broughton, Megan; Baker, Rosemary; Pachana, Nancy A; Angwin, Anthony J; Humphreys, Michael S; Mitchell, Leander; Byrne, Gerard J; Copland, David A; Gallois, Cindy; Hegney, Desley; Chenery, Helen J

    2011-03-01

    Difficulties with memory and communication are prominent and distressing features of dementia which impact on the person with dementia and contribute to caregiver stress and burden. There is a need to provide caregivers with strategies to support and maximize memory and communication abilities in people with dementia. In this project, a team of clinicians, researchers and educators in neuropsychology, psychogeriatrics, nursing and speech pathology translated research-based knowledge from these fields into a program of practical strategies for everyday use by family and professional caregivers. From the available research evidence, the project team identified compensatory or facilitative strategies to assist with common areas of difficulty, and structured these under the mnemonics RECAPS (for memory) and MESSAGE (for communication). This information was adapted for presentation in a DVD-based education program in accordance with known characteristics of effective caregiver education. The resultant DVD comprises (1) information on the nature and importance of memory and communication in everyday life; (2) explanations of common patterns of difficulty and preserved ability in memory and communication across the stages of dementia; (3) acted vignettes demonstrating the strategies, based on authentic samples of speech in dementia; and (4) scenarios to prompt the viewer to consider the benefits of using the strategies. Using a knowledge-translation framework, information and strategies can be provided to family and professional caregivers to help them optimize residual memory and communication in people with dementia. Future development of the materials, incorporating consumer feedback, will focus on methods for enabling wider dissemination.

  4. Detecting and Estimating Contamination of Human DNA Samples in Sequencing and Array-Based Genotype Data

    PubMed Central

    Jun, Goo; Flickinger, Matthew; Hetrick, Kurt N.; Romm, Jane M.; Doheny, Kimberly F.; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2012-01-01

    DNA sample contamination is a serious problem in DNA sequencing studies and may result in systematic genotype misclassification and false positive associations. Although methods exist to detect and filter out cross-species contamination, few methods to detect within-species sample contamination are available. In this paper, we describe methods to identify within-species DNA sample contamination based on (1) a combination of sequencing reads and array-based genotype data, (2) sequence reads alone, and (3) array-based genotype data alone. Analysis of sequencing reads allows contamination detection after sequence data is generated but prior to variant calling; analysis of array-based genotype data allows contamination detection prior to generation of costly sequence data. Through a combination of analysis of in silico and experimentally contaminated samples, we show that our methods can reliably detect and estimate levels of contamination as low as 1%. We evaluate the impact of DNA contamination on genotype accuracy and propose effective strategies to screen for and prevent DNA contamination in sequencing studies. PMID:23103226

  5. Identification of winter wheat from ERTS-1 imagery

    NASA Technical Reports Server (NTRS)

    Williams, D. L.; Morain, S. A.; Barker, B.; Coiner, J. C.

    1973-01-01

    Continuing interpretation of the test area in Finney County, Kansas, has revealed that winter wheat can be successfully identified. This successful identification is based on human recognition of tonal signatures on MSS images. Several different but highly successful interpretation strategies have been employed. These strategies involve the use of both spectral and temporal inputs. Good results have been obtained from a single MSS-5 image acquired at a critical time in the crop cycle (planting). On a test sample of 54,612 acres, 89 percent of the acreage was correctly classified as wheat or non-wheat and the estimated wheat acreage (19,516 acres) was 99 percent of the actual acreage of wheat in the sample area.

  6. A satellite digital controller or 'play that PID tune again, Sam'. [Position, Integral, Derivative feedback control algorithm for design strategy

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1976-01-01

    The problem discussed is to design a digital controller for a typical satellite. The controlled plant is considered to be a rigid body acting in a plane. The controller is assumed to be a digital computer which, when combined with the proposed control algorithm, can be represented as a sampled-data system. The objective is to present a design strategy and technique for selecting numerical values for the control gains (assuming position, integral, and derivative feedback) and the sample rate. The technique is based on the parameter plane method and requires that the system be amenable to z-transform analysis.
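
    A minimal sketch of such a sampled-data PID attitude loop on a rigid-body (double-integrator) plant is shown below; the gains and sample period are placeholders, not the values the parameter plane method would select.

    ```python
    import numpy as np

    # Illustrative gains and sample period (the paper selects these via the parameter-plane method).
    kp, ki, kd, T = 4.0, 0.5, 3.0, 0.1          # proportional, integral, derivative gains; T in s
    inertia = 10.0                              # kg*m^2, rigid body acting in a plane

    theta, omega = 0.2, 0.0                     # initial attitude error (rad) and rate (rad/s)
    integral, prev_err = 0.0, 0.0
    history = []

    for k in range(300):                        # 30 s of closed-loop simulation
        err = -theta                            # regulate attitude to zero
        integral += err * T
        derivative = (err - prev_err) / T
        torque = kp * err + ki * integral + kd * derivative   # PID control law, updated each sample
        prev_err = err

        # Zero-order hold: torque is constant over the sample period; integrate the rigid-body plant.
        alpha = torque / inertia
        theta += omega * T + 0.5 * alpha * T**2
        omega += alpha * T
        history.append(theta)
    ```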

  7. A method for cone fitting based on certain sampling strategy in CMM metrology

    NASA Astrophysics Data System (ADS)

    Zhang, Li; Guo, Chaopeng

    2018-04-01

    A method of cone fitting for engineering metrology is explored and implemented to overcome shortcomings of the current fitting method, in which the calculation of the initial geometric parameters is imprecise and causes poor accuracy in surface fitting. A geometric distance function of the cone is constructed first, then a specific sampling strategy is defined to calculate the initial geometric parameters, and finally a nonlinear least-squares method is used to fit the surface. An experiment was designed to verify the accuracy of the method. The experimental data show that the proposed method obtains initial geometric parameters simply and efficiently, fits the surface precisely, and provides a new, accurate approach to cone fitting in coordinate measurement.
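
    The abstract does not give the distance function or parameterization, so the following is a minimal sketch of the general idea: a cone is parameterized by an apex point, an axis direction, and a half-angle; the orthogonal distance of each sampled point to the cone surface is used as the residual, and scipy.optimize.least_squares refines the initial estimates. The variable names, synthetic data, and initial-guess values are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    def cone_residuals(params, pts):
        """Orthogonal distance of each point to a cone surface.

        params = [ax, ay, az, vx, vy, vz, theta]: apex, axis direction
        (normalized inside), and half-angle theta in radians.
        """
        apex, axis, theta = params[:3], params[3:6], params[6]
        axis = axis / np.linalg.norm(axis)
        d = pts - apex                                      # vectors from apex to points
        h = d @ axis                                        # height along the axis
        r = np.linalg.norm(d - np.outer(h, axis), axis=1)   # radial distance from the axis
        # distance from (h, r) to the line r = h * tan(theta) in the axial plane
        return r * np.cos(theta) - h * np.sin(theta)

    # Illustrative use with synthetic CMM-style sample points (assumed data).
    rng = np.random.default_rng(0)
    h = rng.uniform(5.0, 30.0, 200)
    phi = rng.uniform(0.0, 2 * np.pi, 200)
    true_theta = np.deg2rad(15.0)
    pts = np.column_stack([h * np.tan(true_theta) * np.cos(phi),
                           h * np.tan(true_theta) * np.sin(phi),
                           h]) + rng.normal(0, 0.01, (200, 3))

    x0 = np.array([0.0, 0.0, -1.0, 0.0, 0.0, 1.0, np.deg2rad(10.0)])  # rough initial guess
    fit = least_squares(cone_residuals, x0, args=(pts,))
    print("fitted apex:", fit.x[:3], "half-angle (deg):", np.rad2deg(fit.x[6]))
    ```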

  8. Volatility measurement with directional change in Chinese stock market: Statistical property and investment strategy

    NASA Astrophysics Data System (ADS)

    Ma, Junjun; Xiong, Xiong; He, Feng; Zhang, Wei

    2017-04-01

    Stock price fluctuation is studied in this paper from an intrinsic time perspective. The events, directional change (DC) and overshoot, are used as the time scale of the price time series. Under this directional change law, the corresponding statistical properties and parameter estimates are tested on the Chinese stock market. Furthermore, a directional change trading strategy is proposed for investing in the market portfolio of the Chinese stock market, and both in-sample and out-of-sample performance are compared across different methods of model parameter estimation. We conclude that the DC method can capture important fluctuations in the Chinese stock market and generate profit, owing to the statistical property that the average upturn overshoot size is larger than the average downturn directional change size. The optimal parameter of the DC method is not fixed, and we obtained a 1.8% annual excess return with this DC-based trading strategy.
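
    As a concrete illustration of the directional change framework described above, the sketch below detects DC events in a price series with a fixed threshold: an upturn is confirmed once the price has risen by the threshold from the most recent low, and a downturn once it has fallen by the threshold from the most recent high; the movement beyond each confirmation point is the overshoot. The threshold value and variable names are illustrative assumptions, not the paper's calibrated parameters.

    ```python
    def directional_change_events(prices, threshold=0.018):
        """Return a list of (index, price, 'up'/'down') DC confirmation events.

        threshold is the relative price move (e.g. 0.018 = 1.8%) that confirms
        a directional change; the value is chosen here only for illustration.
        """
        events = []
        mode = "up"                 # current trend we are waiting to see reversed
        extreme = prices[0]         # running high (in up mode) or low (in down mode)
        for i, p in enumerate(prices[1:], start=1):
            if mode == "up":
                extreme = max(extreme, p)
                if p <= extreme * (1 - threshold):      # fall by threshold from the high
                    events.append((i, p, "down"))
                    mode, extreme = "down", p
            else:
                extreme = min(extreme, p)
                if p >= extreme * (1 + threshold):      # rise by threshold from the low
                    events.append((i, p, "up"))
                    mode, extreme = "up", p
        return events

    # Illustrative use on a toy price path.
    prices = [100, 101, 103, 102, 99, 98, 100, 102, 104, 101]
    for idx, price, direction in directional_change_events(prices):
        print(f"DC event at index {idx}: {direction}turn confirmed at price {price}")
    ```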

  9. Database extraction strategies for low-template evidence.

    PubMed

    Bleka, Øyvind; Dørum, Guro; Haned, Hinda; Gill, Peter

    2014-03-01

    Often in forensic cases, the profile of at least one of the contributors to a DNA evidence sample is unknown and a database search is needed to discover possible perpetrators. In this article we consider two types of search strategies to extract suspects from a database using methods based on probability arguments. The performance of the proposed match scores is demonstrated by carrying out a study of each match score relative to the level of allele drop-out in the crime sample, simulating low-template DNA. The efficiency was measured by random man simulation and we compared the performance using the SGM Plus kit and the ESX 17 kit for the Norwegian population, demonstrating that the latter has greatly enhanced power to discover perpetrators of crime in large national DNA databases. The code for the database extraction strategies will be prepared for release in the R-package forensim. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  10. Scientific Reasoning and Its Relationship with Problem Solving: The Case of Upper Primary Science Teachers

    ERIC Educational Resources Information Center

    Alshamali, Mahmoud A.; Daher, Wajeeh M.

    2016-01-01

    This study aimed at identifying the levels of scientific reasoning of upper primary stage (grades 4-7) science teachers based on their use of a problem-solving strategy. The study sample (N = 138; 32 % male and 68 % female) was randomly selected using stratified sampling from an original population of 437 upper primary school teachers. The…

  11. Surveillance strategies for Classical Swine Fever in wild boar – a comprehensive evaluation study to ensure powerful surveillance

    PubMed Central

    Schulz, Katja; Peyre, Marisa; Staubach, Christoph; Schauer, Birgit; Schulz, Jana; Calba, Clémentine; Häsler, Barbara; Conraths, Franz J.

    2017-01-01

    Surveillance of Classical Swine Fever (CSF) should not only focus on livestock, but must also include wild boar. To prevent disease transmission into commercial pig herds, it is therefore vital to have knowledge about the disease status in wild boar. In the present study, we performed a comprehensive evaluation of alternative surveillance strategies for Classical Swine Fever (CSF) in wild boar and compared them with the currently implemented conventional approach. The evaluation protocol was designed using the EVA tool, a decision support tool to help in the development of an economic and epidemiological evaluation protocol for surveillance. To evaluate the effectiveness of the surveillance strategies, we investigated their sensitivity and timeliness. Acceptability was analysed and finally, the cost-effectiveness of the surveillance strategies was determined. We developed 69 surveillance strategies for comparative evaluation between the existing approach and the novel proposed strategies. Sampling only within sub-adults resulted in a better acceptability and timeliness than the currently implemented strategy. Strategies that were completely based on passive surveillance performance did not achieve the desired detection probability of 95%. In conclusion, the results of the study suggest that risk-based approaches can be an option to design more effective CSF surveillance strategies in wild boar. PMID:28266576

  12. Surveillance strategies for Classical Swine Fever in wild boar - a comprehensive evaluation study to ensure powerful surveillance.

    PubMed

    Schulz, Katja; Peyre, Marisa; Staubach, Christoph; Schauer, Birgit; Schulz, Jana; Calba, Clémentine; Häsler, Barbara; Conraths, Franz J

    2017-03-07

    Surveillance of Classical Swine Fever (CSF) should not only focus on livestock, but must also include wild boar. To prevent disease transmission into commercial pig herds, it is therefore vital to have knowledge about the disease status in wild boar. In the present study, we performed a comprehensive evaluation of alternative surveillance strategies for Classical Swine Fever (CSF) in wild boar and compared them with the currently implemented conventional approach. The evaluation protocol was designed using the EVA tool, a decision support tool to help in the development of an economic and epidemiological evaluation protocol for surveillance. To evaluate the effectiveness of the surveillance strategies, we investigated their sensitivity and timeliness. Acceptability was analysed and finally, the cost-effectiveness of the surveillance strategies was determined. We developed 69 surveillance strategies for comparative evaluation between the existing approach and the novel proposed strategies. Sampling only within sub-adults resulted in a better acceptability and timeliness than the currently implemented strategy. Strategies that were completely based on passive surveillance performance did not achieve the desired detection probability of 95%. In conclusion, the results of the study suggest that risk-based approaches can be an option to design more effective CSF surveillance strategies in wild boar.

  13. Simulation-based training for nurses: Systematic review and meta-analysis.

    PubMed

    Hegland, Pål A; Aarlie, Hege; Strømme, Hilde; Jamtvedt, Gro

    2017-07-01

    Simulation-based training is a widespread strategy to improve health-care quality. However, its effect on registered nurses has previously not been established in systematic reviews. The aim of this systematic review is to evaluate the effect of simulation-based training on nurses' skills and knowledge. We searched CDSR, DARE, HTA, CENTRAL, CINAHL, MEDLINE, Embase, ERIC, and SveMed+ for randomised controlled trials (RCT) evaluating the effect of simulation-based training among nurses. Searches were completed in December 2016. Two reviewers independently screened abstracts and full-text, extracted data, and assessed risk of bias. We compared simulation-based training to other learning strategies, high-fidelity simulation to other simulation strategies, and different organisation of simulation training. Data were analysed through meta-analysis and narrative syntheses. GRADE was used to assess the quality of evidence. Fifteen RCTs met the inclusion criteria. For the comparison of simulation-based training to other learning strategies on nurses' skills, six studies in the meta-analysis showed a significant but small effect in favour of simulation (SMD -1.09, CI -1.72 to -0.47). There was large heterogeneity (I² = 85%). For the other comparisons, there was large between-study variation in results. The quality of evidence for all comparisons was graded as low. The effect of simulation-based training varies substantially between studies. Our meta-analysis showed a significant effect of simulation training compared to other learning strategies, but the quality of evidence was low, indicating uncertainty. Other comparisons showed inconsistency in results. Based on our findings, simulation training appears to be an effective strategy to improve nurses' skills, but further good-quality RCTs with adequate sample sizes are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.
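
    Pooled effects such as the SMD reported above are typically obtained by inverse-variance weighting; the sketch below shows a fixed-effect pooling of per-study standardized mean differences together with the I² heterogeneity statistic. The study values are hypothetical placeholders, not the six trials in this review, and the review itself may have used a different (e.g., random-effects) model.

    ```python
    import numpy as np

    def pool_smd(smd, se):
        """Fixed-effect inverse-variance pooling of standardized mean differences."""
        smd, se = np.asarray(smd, float), np.asarray(se, float)
        w = 1.0 / se**2                           # inverse-variance weights
        pooled = np.sum(w * smd) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        q = np.sum(w * (smd - pooled) ** 2)       # Cochran's Q
        df = len(smd) - 1
        i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
        ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
        return pooled, ci, i2

    # Hypothetical per-study SMDs and standard errors, for illustration only.
    smd = [-0.4, -1.2, -0.8, -1.6, -0.3, -2.0]
    se = [0.25, 0.30, 0.20, 0.35, 0.28, 0.40]
    pooled, ci, i2 = pool_smd(smd, se)
    print(f"pooled SMD {pooled:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), I^2 = {i2:.0f}%")
    ```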

  14. Discovering Watson's Crick in High School Chemistry.

    ERIC Educational Resources Information Center

    Whitman, Mark

    1984-01-01

    High school chemistry students begin the school year by reading and studying "The Double Helix" by James B. Watson. Rationale, objectives, and instructional strategies for this assignment are discussed. Sample discussion questions based on the book are included. (JN)

  15. Development and implementation of a Hope Intervention Program.

    PubMed

    Herth, K A

    2001-07-01

    To describe the development and evaluation of the Hope Intervention Program (HIP), designed to enhance hope, based on the Hope Process Framework. Descriptive and evaluative. Outpatient oncology clinics of two large hospitals. Convenience sample--38 adults with first recurrence of cancer. Questionnaire completed at end of last HIP session, three months, six months, and nine months. Helpfulness of the HIP components in maintaining hope. Participants used intervention strategies that represented all four attributes of hope, with specific strategy use varying across time and individuals. Data suggested that HIP positively affected the participants' rebuilding and maintenance of hope. HIP has the potential to positively influence hope, and it supports the strategic vital role that nurses play in implementing research-based strategies designed to engender hope in their patients and prevent hopelessness.

  16. Sampling strategies for estimating brook trout effective population size

    Treesearch

    Andrew R. Whiteley; Jason A. Coombs; Mark Hudy; Zachary Robinson; Keith H. Nislow; Benjamin H. Letcher

    2012-01-01

    The influence of sampling strategy on estimates of effective population size (Ne) from single-sample genetic methods has not been rigorously examined, though these methods are increasingly used. For headwater salmonids, spatially close kin association among age-0 individuals suggests that sampling strategy (number of individuals and location from...

  17. A ratiometric strategy-based electrochemical sensing interface for the sensitive and reliable detection of imidacloprid.

    PubMed

    Li, Xueyan; Kan, Xianwen

    2018-04-30

    In this study, a ratiometric strategy-based electrochemical sensor was developed by electropolymerization of thionine (THI) and β-cyclodextrin (β-CD) composite films on a glassy carbon electrode surface for imidacloprid (IMI) detection. THI played the role of an inner reference element to provide a built-in correction. In addition, the modified β-CD showed good selective enrichment for IMI to improve the sensitivity and anti-interference ability of the sensor. The current ratio between IMI and THI was calculated as the detected signal for IMI sensing. Compared with common single-signal sensing, the proposed ratiometric strategy showed a wider linear range (4.0 × 10⁻⁸–1.0 × 10⁻⁵ mol L⁻¹) and a lower limit of detection (1.7 × 10⁻⁸ mol L⁻¹) for IMI detection. Moreover, the ratiometric strategy endowed the sensor with good accuracy, reproducibility, and stability. The sensor was also used for IMI determination in real samples with satisfactory results. The simple, effective, and reliable approach reported in this study can also be used to prepare ratiometric strategy-based electrochemical sensors for the selective and sensitive detection of other compounds with good accuracy and stability.

  18. Comparison of different incubation conditions for microbiological environmental monitoring.

    PubMed

    Gordon, Oliver; Berchtold, Manfred; Staerk, Alexandra; Roesti, David

    2014-01-01

    Environmental monitoring represents an integral part of the microbiological quality control system of a pharmaceutical manufacturing operation. However, guidance documents differ regarding recommendation of a procedure, particularly regarding incubation time, incubation temperature, or nutrient media. Because of these discrepancies, many manufacturers decide for a particular environmental monitoring sample incubation strategy and support this decision with validation data. Such validations are typically laboratory-based in vitro studies, meaning that these are based on comparing incubation conditions and nutrient media through use of cultured microorganisms. An informal survey of the results of these in vitro studies performed at Novartis or European manufacturing sites of different pharmaceutical companies highlighted that no consensus regarding the optimal incubation conditions for microbial recovery existed. To address this question differently, we collected a significant amount of samples directly from air, inanimate surfaces, and personnel in pharmaceutical production and packaging rooms during manufacturing operation (in situ study). Samples were incubated under different conditions suggested in regulatory guidelines, and recovery of total aerobic microorganisms as well as moulds was assessed. We found the highest recovery of total aerobic count from areas with personnel flow using a general microbiological growth medium incubated at 30-35 °C. The highest recovery of moulds was obtained with mycological medium incubated at 20-25 °C. Single-plate strategies (two-temperature incubation or an intermediate incubation temperature of 25-30 °C) also yielded reasonable recovery of total aerobic count and moulds. However, recovery of moulds was found to be highly inefficient at 30-35 °C compared to lower incubation temperatures. This deficiency could not be rectified by subsequent incubation at 20-25 °C. A laboratory-based in vitro study performed in parallel was inconclusive. We consider our results potentially conferrable to other pharmaceutical manufacturing sites in moderate climate zones and believe that these should represent a valuable reference for definition of the incubation strategy of microbiological environmental monitoring samples. Microbiological environmental monitoring confirms that pharmaceutical cleanrooms are in an appropriate hygienic condition for manufacturing of drug products. Guidance documents from different health authorities or expert groups differ regarding recommendation of the applied incubation time, incubation temperature, or nutrient media. Therefore, many pharmaceutical manufacturers perform studies that aim to identify the optimal incubation setup for environmental monitoring samples. An informal survey of the results of such studies, which had been performed at Novartis or European manufacturing sites of different pharmaceutical companies, highlighted no consensus regarding the optimal incubation conditions for microbial recovery. All these studies had been conducted in the laboratory using selections of cultured microbial strains. We tried to solve this disagreement by collecting a significant amount of real environmental monitoring samples directly from the environment in pharmaceutical production and packaging rooms during manufacturing operation. These samples were then incubated under different conditions suggested in the regulatory guidelines. 
We believe that the results of our study are more meaningful than laboratory-based experiments because we used environmental samples with microorganisms directly isolated from the manufacturing area. Therefore, we believe that our results should represent a valuable reference for definition of the incubation strategy of microbiological environmental monitoring samples. © PDA, Inc. 2014.

  19. An integrative machine learning strategy for improved prediction of essential genes in Escherichia coli metabolism using flux-coupled features.

    PubMed

    Nandi, Sutanu; Subramanian, Abhishek; Sarkar, Ram Rup

    2017-07-25

    Prediction of essential genes helps to identify a minimal set of genes that are absolutely required for the appropriate functioning and survival of a cell. The available machine learning techniques for essential gene prediction have inherent problems, like imbalanced provision of training datasets, biased choice of the best model for a given balanced dataset, choice of a complex machine learning algorithm, and data-based automated selection of biologically relevant features for classification. Here, we propose a simple support vector machine-based learning strategy for the prediction of essential genes in Escherichia coli K-12 MG1655 metabolism that integrates a non-conventional combination of an appropriate sample balanced training set, a unique organism-specific genotype, phenotype attributes that characterize essential genes, and optimal parameters of the learning algorithm to generate the best machine learning model (the model with the highest accuracy among all the models trained for different sample training sets). For the first time, we also introduce flux-coupled metabolic subnetwork-based features for enhancing the classification performance. Our strategy proves to be superior as compared to previous SVM-based strategies in obtaining a biologically relevant classification of genes with high sensitivity and specificity. This methodology was also trained with datasets of other recent supervised classification techniques for essential gene classification and tested using reported test datasets. The testing accuracy was always high as compared to the known techniques, proving that our method outperforms known methods. Observations from our study indicate that essential genes are conserved among homologous bacterial species, demonstrate high codon usage bias, GC content and gene expression, and predominantly possess a tendency to form physiological flux modules in metabolism.
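
    The essential-gene classifier above combines a balanced training set with a support vector machine; the sketch below shows the general pattern with scikit-learn, balancing the minority class (essential genes) against an equal-sized random subsample of the majority class before training. The feature matrix, labels, and parameters are placeholders, not the flux-coupled features or tuned settings of the paper.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import classification_report

    rng = np.random.default_rng(7)

    # Hypothetical gene features (stand-ins for codon bias, GC content, expression, subnetwork flags).
    X = rng.normal(size=(1200, 20))
    y = (rng.random(1200) < 0.15).astype(int)   # ~15% "essential" genes (imbalanced labels)

    # Build a sample-balanced training set: all minority examples plus an equal-sized
    # random subsample of the majority class (one simple balancing scheme).
    ess_idx = np.where(y == 1)[0]
    non_idx = rng.choice(np.where(y == 0)[0], size=len(ess_idx), replace=False)
    bal_idx = np.concatenate([ess_idx, non_idx])
    X_bal, y_bal = X[bal_idx], y[bal_idx]

    X_tr, X_te, y_tr, y_te = train_test_split(X_bal, y_bal, test_size=0.3, random_state=0)
    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te),
                                target_names=["non-essential", "essential"]))
    ```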

  20. Serum markers for type II diabetes mellitus

    DOEpatents

    Metz, Thomas O; Qian, Wei-Jun; Jacobs, Jon M; Polpitiya, Ashoka D; Camp, II, David G; Smith, Richard D

    2014-03-18

    A method for identifying persons with increased risk of developing type 2 diabetes mellitus utilizing selected biomarkers described hereafter either alone or in combination. The present invention allows for broad based, reliable, screening of large population bases and provides other advantages, including the formulation of effective strategies for characterizing, archiving, and contrasting data from multiple sample types under varying conditions.

  1. Reported Prevalence of Evidence-Based Instructional Practices by Special Educators in the Czech Republic

    ERIC Educational Resources Information Center

    Carter, Mark; Strnadova, Iva; Stephenson, Jennifer

    2012-01-01

    The reported level of use of eight instructional strategies in a sample of 531 special educators in the Czech Republic was examined in this study. Consistent with recent parallel studies in North America and Australia, the respondents reported that they used a combination of evidence-based instructional practices (such as direct instruction and…

  2. Relationships between Students' Strategies for Influencing Their Study Environment and Their Strategic Approach to Studying

    ERIC Educational Resources Information Center

    Jungert, Tomas; Rosander, Michael

    2009-01-01

    The purpose of this study was to investigate the relationship between student influence, students' strategic approaches to studying and academic achievement, and to examine differences between students in a Master's programme in Engineering with conventional teaching and one based on problem-based learning in a sample of 268 students. A version of…

  3. Investigating the Effectiveness of a POE-Based Teaching Activity on Students' Understanding of Condensation

    ERIC Educational Resources Information Center

    Costu, Bayram; Ayas, Alipasa; Niaz, Mansoor

    2012-01-01

    This article reports on the development of a Predict-Observe-Explain, POE-based teaching strategy to facilitate conceptual change and its effectiveness on student understanding of condensation. The sample consisted of 52 first-year students in primary science education department. Students' ideas were elicited using a test consisting of five probe…

  4. Challenges to be overcome using population-based sampling methods to recruit veterans for a study of post-traumatic stress disorder and traumatic brain injury.

    PubMed

    Bayley, Peter J; Kong, Jennifer Y; Helmer, Drew A; Schneiderman, Aaron; Roselli, Lauren A; Rosse, Stephanie M; Jackson, Jordan A; Baldwin, Janet; Isaac, Linda; Nolasco, Michael; Blackman, Marc R; Reinhard, Matthew J; Ashford, John Wesson; Chapman, Julie C

    2014-04-08

    Many investigators are interested in recruiting veterans from recent conflicts in Afghanistan and Iraq with Traumatic Brain Injury (TBI) and/or Post Traumatic Stress Disorder (PTSD). Researchers pursuing such studies may experience problems in recruiting sufficient numbers unless effective strategies are used. Currently, there is very little information on recruitment strategies for individuals with TBI and/or PTSD. It is known that groups of patients with medical conditions may be less likely to volunteer for clinical research. This study investigated the feasibility of recruiting veterans returning from recent military conflicts--Operation Enduring Freedom (OEF) and Operation Iraqi Freedom (OIF)--using a population-based sampling method. Individuals were sampled from a previous epidemiological study. Three study sites focused on recruiting survey respondents (n = 445) who lived within a 60 mile radius of one of the sites. Overall, the successful recruitment of veterans using a population-based sampling method was dependent on the ability to contact potential participants following mass mailing. Study enrollment of participants with probable TBI and/or PTSD had a recruitment yield (enrolled/total identified) of 5.4%. We were able to contact 146 individuals, representing a contact rate of 33%. Sixty-six of the individuals contacted were screened. The major reasons for not screening included a stated lack of interest in the study (n = 37), a failure to answer screening calls after initial contact (n = 30), and an unwillingness or inability to travel to a study site (n = 10). Based on the phone screening, 36 veterans were eligible for the study. Twenty-four veterans were enrolled, (recruitment yield = 5.4%) and twelve were not enrolled for a variety of reasons. Our experience with a population-based sampling method for recruitment of recent combat veterans illustrates the challenges encountered, particularly contacting and screening potential participants. The screening and enrollment data will help guide recruitment for future studies using population-based methods.

  5. Assessment of Sample Preparation Bias in Mass Spectrometry-Based Proteomics.

    PubMed

    Klont, Frank; Bras, Linda; Wolters, Justina C; Ongay, Sara; Bischoff, Rainer; Halmos, Gyorgy B; Horvatovich, Péter

    2018-04-17

    For mass spectrometry-based proteomics, the selected sample preparation strategy is a key determinant for information that will be obtained. However, the corresponding selection is often not based on a fit-for-purpose evaluation. Here we report a comparison of in-gel (IGD), in-solution (ISD), on-filter (OFD), and on-pellet digestion (OPD) workflows on the basis of targeted (QconCAT-multiple reaction monitoring (MRM) method for mitochondrial proteins) and discovery proteomics (data-dependent acquisition, DDA) analyses using three different human head and neck tissues (i.e., nasal polyps, parotid gland, and palatine tonsils). Our study reveals differences between the sample preparation methods, for example, with respect to protein and peptide losses, quantification variability, protocol-induced methionine oxidation, and asparagine/glutamine deamidation as well as identification of cysteine-containing peptides. However, none of the methods performed best for all types of tissues, which argues against the existence of a universal sample preparation method for proteome analysis.

  6. Ensemble Grouping Strategies for Embedded Stochastic Collocation Methods Applied to Anisotropic Diffusion Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Elia, M.; Edwards, H. C.; Hu, J.

    Previous work has demonstrated that propagating groups of samples, called ensembles, together through forward simulations can dramatically reduce the aggregate cost of sampling-based uncertainty propagation methods [E. Phipps, M. D'Elia, H. C. Edwards, M. Hoemmen, J. Hu, and S. Rajamanickam, SIAM J. Sci. Comput., 39 (2017), pp. C162--C193]. However, critical to the success of this approach when applied to challenging problems of scientific interest is the grouping of samples into ensembles to minimize the total computational work. For example, the total number of linear solver iterations for ensemble systems may be strongly influenced by which samples form the ensemble when applying iterative linear solvers to parameterized and stochastic linear systems. In this paper we explore sample grouping strategies for local adaptive stochastic collocation methods applied to PDEs with uncertain input data, in particular canonical anisotropic diffusion problems where the diffusion coefficient is modeled by truncated Karhunen--Loève expansions. Finally, we demonstrate that a measure of the total anisotropy of the diffusion coefficient is a good surrogate for the number of linear solver iterations for each sample and therefore provides a simple and effective metric for grouping samples.

  7. Ensemble Grouping Strategies for Embedded Stochastic Collocation Methods Applied to Anisotropic Diffusion Problems

    DOE PAGES

    D'Elia, M.; Edwards, H. C.; Hu, J.; ...

    2018-01-18

    Previous work has demonstrated that propagating groups of samples, called ensembles, together through forward simulations can dramatically reduce the aggregate cost of sampling-based uncertainty propagation methods [E. Phipps, M. D'Elia, H. C. Edwards, M. Hoemmen, J. Hu, and S. Rajamanickam, SIAM J. Sci. Comput., 39 (2017), pp. C162--C193]. However, critical to the success of this approach when applied to challenging problems of scientific interest is the grouping of samples into ensembles to minimize the total computational work. For example, the total number of linear solver iterations for ensemble systems may be strongly influenced by which samples form the ensemble when applying iterative linear solvers to parameterized and stochastic linear systems. In this paper we explore sample grouping strategies for local adaptive stochastic collocation methods applied to PDEs with uncertain input data, in particular canonical anisotropic diffusion problems where the diffusion coefficient is modeled by truncated Karhunen--Loève expansions. Finally, we demonstrate that a measure of the total anisotropy of the diffusion coefficient is a good surrogate for the number of linear solver iterations for each sample and therefore provides a simple and effective metric for grouping samples.
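
    The grouping heuristic described above (use a cheap scalar surrogate, the total anisotropy of the diffusion coefficient, to predict solver cost and place similar samples in the same ensemble) can be sketched as sorting samples by the surrogate and chunking them into fixed-size ensembles. The surrogate mapping and the ensemble size below are illustrative stand-ins, not the paper's exact formulation.

    ```python
    import numpy as np

    def group_into_ensembles(samples, surrogate, ensemble_size=8):
        """Group samples into ensembles of similar predicted solver cost.

        samples   : array of shape (n_samples, n_params), e.g. KL coefficients
        surrogate : callable mapping one sample to a scalar cost proxy
        """
        scores = np.array([surrogate(s) for s in samples])
        order = np.argsort(scores)                      # similar-cost samples become neighbours
        return [order[i:i + ensemble_size] for i in range(0, len(order), ensemble_size)]

    # Illustrative surrogate: "total anisotropy" taken here as the spread of
    # hypothetical direction-dependent diffusion coefficients for a sample.
    def total_anisotropy(kl_coeffs):
        diffusivities = 1.0 + 0.5 * np.abs(kl_coeffs)   # placeholder mapping to diffusivities
        return diffusivities.max() / diffusivities.min()

    rng = np.random.default_rng(3)
    samples = rng.normal(size=(64, 4))                  # 64 stochastic collocation samples
    ensembles = group_into_ensembles(samples, total_anisotropy)
    print(f"{len(ensembles)} ensembles, first ensemble sample indices: {ensembles[0]}")
    ```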

  8. Measuring strategies for learning regulation in medical education: scale reliability and dimensionality in a Swedish sample.

    PubMed

    Edelbring, Samuel

    2012-08-15

    The degree of learners' self-regulated learning and dependence on external regulation influence learning processes in higher education. These regulation strategies are commonly measured by questionnaires developed in other settings than in which they are being used, thereby requiring renewed validation. The aim of this study was to psychometrically evaluate the learning regulation strategy scales from the Inventory of Learning Styles with Swedish medical students (N = 206). The regulation scales were evaluated regarding their reliability, scale dimensionality and interrelations. The primary evaluation focused on dimensionality and was performed with Mokken scale analysis. To assist future scale refinement, additional item analysis, such as item-to-scale correlations, was performed. Scale scores in the Swedish sample displayed good reliability in relation to published results: Cronbach's alpha: 0.82, 0.72, and 0.65 for self-regulation, external regulation and lack of regulation scales respectively. The dimensionalities in scales were adequate for self-regulation and its subscales, whereas external regulation and lack of regulation displayed less unidimensionality. The established theoretical scales were largely replicated in the exploratory analysis. The item analysis identified two items that contributed little to their respective scales. The results indicate that these scales have an adequate capacity for detecting the three theoretically proposed learning regulation strategies in the medical education sample. Further construct validity should be sought by interpreting scale scores in relation to specific learning activities. Using established scales for measuring students' regulation strategies enables a broad empirical base for increasing knowledge on regulation strategies in relation to different disciplinary settings and contributes to theoretical development.
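
    Scale reliabilities such as the Cronbach's alpha values reported above are straightforward to reproduce from item-level data; the sketch below computes alpha for a set of questionnaire items. The simulated responses are placeholders, not the Inventory of Learning Styles data.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) array of item scores."""
        items = np.asarray(items, float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()      # sum of item variances
        total_var = items.sum(axis=1).var(ddof=1)        # variance of the scale total
        return (k / (k - 1)) * (1 - item_vars / total_var)

    # Illustrative use: simulated Likert-style responses for a 7-item scale.
    rng = np.random.default_rng(11)
    trait = rng.normal(size=(206, 1))                    # latent regulation tendency
    items = np.clip(np.round(3 + trait + rng.normal(0, 0.8, (206, 7))), 1, 5)
    print(f"Cronbach's alpha: {cronbach_alpha(items):.2f}")
    ```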

  9. Multiple strategies to identify HIV-positive black men who have sex with men and transgender women in New York City: a cross-sectional analysis of recruitment results.

    PubMed

    Franks, Julie; Mannheimer, Sharon B; Hirsch-Moverman, Yael; Hayes-Larson, Eleanor; Colson, Paul W; Ortega, Hugo; El-Sadr, Wafaa M

    2018-03-01

    Black men who have sex with men and transgender women are at high risk for HIV infection, but are more likely to be unaware of their infection or not in care for diagnosed HIV compared to other races. Respondent driven sampling has been advanced as a method to reach stigmatized and hidden populations for HIV testing. We compared strategies to recruit black, substance-using men who have sex with men and transgender women to identify newly diagnosed HIV infection, or those previously diagnosed but not in care. The STAR (Seek, Test, and Retain) study (ClinicalTrials.gov NCT01790360) used several recruitment strategies to identify black, substance-using men who have sex with men and transgender women with undiagnosed HIV infection or with previously diagnosed HIV infection but who were not in HIV care. Respondent-driven sampling, community-based recruitment and online advertising were used to recruit participants. Incentivized peer referral was integrated into all recruitment strategies. Participants completed interviewer-administered questionnaires and HIV testing. Demographic and HIV risk-related characteristics and recruitment strategy were summarized and stratified by HIV status. Associations were tested using Pearson's chi-squared, Fisher's exact, and Wilcoxon rank sum tests. Factors associated with HIV-positive diagnosis at p < 0.1 were included in a multivariable logistic regression model. From July 2012 through October 2015, the study enrolled 1929 participants; 96.3% men who have sex with men and 3.7% transgender women. Behavioural risk factors included recent condomless anal sex (55.6%) and recent substance use during sex (73.1%). HIV prevalence was 8.7%. In multivariable analysis, significant associations with HIV infection included being transgender; non-Hispanic black; gay/homosexual orientation; not homeless; and less likely to have insufficient income for necessities. Among recruitment strategies, respondent driven sampling was least effective in identifying HIV-positive participants. Integrating multiple recruitment strategies yielded a large sample of black men who have sex with men and transgender women at substantial risk for HIV. Respondent-driven sampling was less effective than other strategies at identifying men who have sex with men and transgender women with HIV. © 2018 The Authors. Journal of the International AIDS Society published by John Wiley & sons Ltd on behalf of the International AIDS Society.

  10. An integrative strategy for quantitative analysis of the N-glycoproteome in complex biological samples.

    PubMed

    Wang, Ji; Zhou, Chuang; Zhang, Wei; Yao, Jun; Lu, Haojie; Dong, Qiongzhu; Zhou, Haijun; Qin, Lunxiu

    2014-01-15

    The complexity of protein glycosylation makes it difficult to characterize glycosylation patterns on a proteomic scale. In this study, we developed an integrated strategy for comparatively analyzing N-glycosylation/glycoproteins quantitatively from complex biological samples in a high-throughput manner. This strategy entailed separating and enriching glycopeptides/glycoproteins using lectin affinity chromatography, and then tandem labeling them with 18O/16O to generate a mass shift of 6 Da between the paired glycopeptides, and finally analyzing them with liquid chromatography-mass spectrometry (LC-MS) and the automatic quantitative method we developed based on Mascot Distiller. The accuracy and repeatability of this strategy were first verified using standard glycoproteins; linearity was maintained within a range of 1:10-10:1. The peptide concentration ratios obtained by the self-build quantitative method were similar to both the manually calculated and theoretical values, with a standard deviation (SD) of 0.023-0.186 for glycopeptides. The feasibility of the strategy was further confirmed with serum from hepatocellular carcinoma (HCC) patients and healthy individuals; the expression of 44 glycopeptides and 30 glycoproteins were significantly different between HCC patient and control serum. This strategy is accurate, repeatable, and efficient, and may be a useful tool for identification of disease-related N-glycosylation/glycoprotein changes.

  11. Improving snow density estimation for mapping SWE with Lidar snow depth: assessment of uncertainty in modeled density and field sampling strategies in NASA SnowEx

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Smyth, E.; Small, E. E.

    2017-12-01

    The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.

  12. Reporting pesticide assessment results to farmworker families: development, implementation, and evaluation of a risk communication strategy.

    PubMed Central

    Quandt, Sara A; Doran, Alicia M; Rao, Pamela; Hoppin, Jane A; Snively, Beverly M; Arcury, Thomas A

    2004-01-01

    The collection of environmental samples presents a responsibility to return information to the affected participants. Explaining complex and often ambiguous scientific information to a lay audience is a challenge. As shown by environmental justice research, this audience frequently has limited formal education, increasing the challenge for researchers to explain the data collected, the risk indicated by the findings, and action the affected community should take. In this study we describe the development and implementation of a risk communication strategy for environmental pesticide samples collected in the homes of Latino/a migrant and seasonal farmworkers in a community-based participatory research project. The communication strategy was developed with community input and was based on face-to-face meetings with members of participating households. Using visual displays of data effectively conveyed information about individual household contamination and placed it in the context of community findings. The lack of national reference data and definitive standards for action necessitated a simplified risk message. We review the strengths and weaknesses of such an approach and suggest areas for future research in risk communication to communities affected by environmental health risks. PMID:15064174

  13. A proposal of optimal sampling design using a modularity strategy

    NASA Astrophysics Data System (ADS)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

    Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for different management tasks. Planning the spatial distribution and number of pressure observations is known as sampling design, and it has traditionally been addressed with model calibration in mind. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., in order to manage background leakages, detect anomalies and bursts, guarantee service quality, etc. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and leakage management, has been addressed considering optimal network segmentation and the modularity index using a multiobjective strategy. Optimal network segmentation is the basis for identifying network modules by means of optimal conceptual cuts, which are the candidate locations of closed gates or flow meters creating the DMAs. Starting from the WDN-oriented modularity index as a metric for WDN segmentation, this paper proposes a new way to perform sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system mainly based on network topology and on weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.
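
    The sampling-oriented modularity index itself is specific to this paper, but the underlying idea of partitioning the network into modules by modularity and using the modules as a skeleton for placing meters can be illustrated with standard modularity-based community detection on a pipe graph, choosing one representative node per module as a candidate meter location. networkx is assumed available; the graph, weights, and the "most central node per module" placement rule are illustrative choices, not the authors' multiobjective optimization.

    ```python
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    # Hypothetical small pipe network: nodes are junctions, edge weights are pipe weights.
    G = nx.Graph()
    edges = [(1, 2, 1.0), (2, 3, 1.0), (3, 1, 0.8), (3, 4, 0.3),
             (4, 5, 1.0), (5, 6, 1.0), (6, 4, 0.9), (6, 7, 0.2),
             (7, 8, 1.0), (8, 9, 1.0), (9, 7, 0.7)]
    G.add_weighted_edges_from(edges)

    # Modularity-based segmentation of the network into modules.
    modules = greedy_modularity_communities(G, weight="weight")

    # One simple placement rule: pick the most central node of each module
    # as a candidate pressure-meter location.
    centrality = nx.betweenness_centrality(G, weight="weight")
    for k, module in enumerate(modules):
        meter = max(module, key=centrality.get)
        print(f"module {k}: nodes {sorted(module)}, candidate meter at node {meter}")
    ```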

  14. Effective surveillance strategies following a potential classical Swine Fever incursion in a remote wild pig population in North-Western Australia.

    PubMed

    Leslie, E; Cowled, B; Graeme Garner, M; Toribio, J-A L M L; Ward, M P

    2014-10-01

    Early disease detection and efficient methods of proving disease freedom can substantially improve the response to incursions of important transboundary animal diseases in previously free regions. We used a spatially explicit, stochastic disease spread model to simulate the spread of classical swine fever in wild pigs in a remote region of northern Australia and to assess the performance of disease surveillance strategies to detect infection at different time points and to delineate the size of the resulting outbreak. Although disease would likely be detected, simple random sampling was suboptimal. Radial and leapfrog sampling improved the effectiveness of surveillance at various stages of the simulated disease incursion. This work indicates that at earlier stages, radial sampling can reduce epidemic length and achieve faster outbreak delineation and control, but at later stages leapfrog sampling will outperform radial sampling in relation to supporting faster disease control with a less-extensive outbreak area. Due to the complexity of wildlife population dynamics and group behaviour, a targeted approach to surveillance needs to be implemented for the efficient use of resources and time. Using a more situation-based surveillance approach and accounting for disease distribution and the time period over which an epidemic has occurred is the best way to approach the selection of an appropriate surveillance strategy. © 2013 Blackwell Verlag GmbH.

  15. Clinically relevant advances in on-chip affinity-based electrophoresis and electrochromatography.

    PubMed

    Hou, Chenlu; Herr, Amy E

    2008-08-01

    Clinical and point-of-care disease diagnostics promise to play an important role in personalized medicine, new approaches to global health, and health monitoring. Emerging instrument platforms based on lab-on-a-chip technology can confer performance advantages successfully exploited in electrophoresis and electrochromatography to affinity-based electrokinetic separations. This review surveys lab-on-a-chip diagnostic developments in affinity-based electrokinetic separations for quantitation of proteins, integration of preparatory functions needed for subsequent analysis of diverse biological samples, and initial forays into multiplexed analyses. The technologies detailed here underpin new clinical and point-of-care diagnostic strategies. The techniques and devices promise to advance translation of until now laboratory-based sample preparation and analytical assays to near-patient settings.

  16. Recruiting Drug Using Men Who Have Sex with Men in Behavioral Intervention Trials: A Comparison of Internet and Field-based Strategies

    PubMed Central

    Parsons, Jeffrey T.; Vial, Andrea C.; Starks, Tyrel J.; Golub, Sarit A.

    2012-01-01

    Although online and field-based samples of men who have sex with men (MSM) have been compared on a variety of markers, differences in drug use have not been well examined. In addition, generalization from studies comparing recruitment methods is often limited by a focus on either HIV seropositive or seronegative MSM. We compared two New York City-based samples of MSM recruited simultaneously between July 2009 and January 2010—one sample recruited in the field (n = 2402) and one sample recruited via the Internet (n = 694). All recruitment efforts targeted men without restriction on age or self-reported HIV status. Our results show marked differences in drug and alcohol use between online and field-based samples of MSM. Specifically, men surveyed online were significantly more likely to have tried a variety of drugs, including methamphetamine, cocaine, and ecstasy. Men recruited online were also more likely to report older age, HIV positive serostatus, and “never” using condoms. Internet-based recruitment was found to be more cost-effective in terms of recruitment yield than was field-based recruitment. PMID:22684657

  17. Multilayered nonuniform sampling for three-dimensional scene representation

    NASA Astrophysics Data System (ADS)

    Lin, Huei-Yung; Xiao, Yu-Hua; Chen, Bo-Ren

    2015-09-01

    The representation of a three-dimensional (3-D) scene is essential in multiview imaging technologies. We present a unified geometry and texture representation based on global resampling of the scene. A layered data map representation with a distance-dependent nonuniform sampling strategy is proposed. It is capable of increasing the details of the 3-D structure locally and is compact in size. The 3-D point cloud obtained from the multilayered data map is used for view rendering. For any given viewpoint, image synthesis with different levels of detail is carried out using the quadtree-based nonuniformly sampled 3-D data points. Experimental results are presented using the 3-D models of reconstructed real objects.

  18. Interpolating seismic data via the POCS method based on shearlet transform

    NASA Astrophysics Data System (ADS)

    Jicheng, Liu; Yongxin, Chou; Jianjiang, Zhu

    2018-06-01

    A method based on shearlet transform and the projection onto convex sets with L0-norm constraint is proposed to interpolate irregularly sampled 2D and 3D seismic data. The 2D directional filter of shearlet transform is constructed by modulating a low-pass diamond filter pair to minimize the effect of additional edges introduced by the missing traces. In order to abate the spatial aliasing and control the maximal gap between missing traces for a 3D data cube, a 2D separable jittered sampling strategy is discussed. Finally, numerical experiments on 2D and 3D synthetic and real data with different under-sampling rates prove the validity of the proposed method.
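
    The POCS scheme above alternates between enforcing sparsity in the shearlet domain and reinserting the known traces. Since a shearlet transform is not a standard library routine, the sketch below uses a 2D FFT as a stand-in transform to show the iteration structure only: hard-threshold in the transform domain, invert, then project onto the data-consistency set by restoring the observed samples. The threshold schedule, iteration count, and synthetic data are illustrative assumptions.

    ```python
    import numpy as np

    def pocs_interpolate(data, mask, n_iter=50, thresh0=0.5):
        """POCS-style interpolation of irregularly sampled 2D data.

        data : observed section with zeros at missing traces
        mask : boolean array, True where traces were actually recorded
        Stand-in transform: 2D FFT (the paper uses a shearlet transform).
        """
        x = data.copy()
        for k in range(n_iter):
            coeffs = np.fft.fft2(x)
            # exponentially decreasing hard threshold (L0-style constraint)
            tau = thresh0 * np.abs(coeffs).max() * np.exp(-4.0 * k / n_iter)
            coeffs[np.abs(coeffs) < tau] = 0.0
            x = np.real(np.fft.ifft2(coeffs))
            x[mask] = data[mask]              # projection onto the data-consistency set
        return x

    # Illustrative use: a synthetic dipping-event section with ~40% of traces removed.
    nt, nx = 128, 64
    t = np.arange(nt)[:, None]
    section = np.sin(2 * np.pi * (t + 2.0 * np.arange(nx)) / 32.0)
    rng = np.random.default_rng(5)
    mask = rng.random(nx) > 0.4               # keep ~60% of traces (jittered in practice)
    mask2d = np.repeat(mask[None, :], nt, axis=0)
    observed = section * mask2d
    recovered = pocs_interpolate(observed, mask2d)
    print("RMS error on missing traces:",
          np.sqrt(np.mean((recovered - section)[~mask2d] ** 2)))
    ```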

  19. Recent approaches in sensitive enantioseparations by CE.

    PubMed

    Sánchez-Hernández, Laura; Castro-Puyana, María; Marina, María Luisa; Crego, Antonio L

    2012-01-01

    The latest strategies and instrumental improvements for enhancing the detection sensitivity in chiral analysis by CE are reviewed in this work. Following the previous reviews by García-Ruiz et al. (Electrophoresis 2006, 27, 195-212) and Sánchez-Hernández et al. (Electrophoresis 2008, 29, 237-251; Electrophoresis 2010, 31, 28-43), this review includes those papers that were published during the period from June 2009 to May 2011. These works describe the use of offline and online sample treatment techniques, online sample preconcentration techniques based on electrophoretic principles, and alternative detection systems to UV-Vis to increase the detection sensitivity. The application of the above-mentioned strategies, either alone or combined, to improve the sensitivity in the enantiomeric analysis of a broad range of samples, such as pharmaceutical, biological, food and environmental samples, makes it possible to decrease the limits of detection to as low as 10⁻¹² M. The use of microchips to achieve sensitive chiral separations is also discussed. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. The Cost-Effectiveness of Cervical Self-Sampling to Improve Routine Cervical Cancer Screening: The Importance of Respondent Screening History and Compliance.

    PubMed

    Burger, Emily A; Sy, Stephen; Nygård, Mari; Kim, Jane J

    2017-01-01

    Human papillomavirus (HPV) testing allows women to self-collect cervico-vaginal cells at home (i.e., self-sampling). Using primary data from a randomized pilot study, we evaluated the long-term consequences and cost-effectiveness of using self-sampling to improve participation to routine cervical cancer screening in Norway. We compared a strategy reflecting screening participation (using reminder letters) to strategies that involved mailing self-sampling device kits to women noncompliant to screening within a 5- or 10-year period under two scenarios: (A) self-sampling respondents had moderate under-screening histories, or (B) respondents to self-sampling had moderate and severe under-screening histories. Model outcomes included quality-adjusted life-years (QALY) and lifetime costs. The "most cost-effective" strategy was identified as the strategy just below $100,000 per QALY gained. Mailing self-sampling device kits to all women noncompliant to screening within a 5- or 10-year period can be more effective and less costly than the current reminder letter policy; however, the optimal self-sampling strategy was dependent on the profile of self-sampling respondents. For example, "10-yearly self-sampling" is preferred ($95,500 per QALY gained) if "5-yearly self-sampling" could only attract moderate under-screeners; however, "5-yearly self-sampling" is preferred if this strategy could additionally attract severe under-screeners. Targeted self-sampling of noncompliers likely represents good value-for-money; however, the preferred strategy is contingent on the screening histories and compliance of respondents. The magnitude of the health benefit and optimal self-sampling strategy is dependent on the profile and behavior of respondents. Health authorities should understand these factors prior to selecting and implementing a self-sampling policy. Cancer Epidemiol Biomarkers Prev; 26(1); 95-103. ©2016 AACR. ©2016 American Association for Cancer Research.
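
    The decision rule described above, accepting the most effective strategy whose incremental cost-effectiveness ratio (ICER) stays below a willingness-to-pay threshold of $100,000 per QALY gained, can be illustrated with a small calculation over hypothetical strategies; the costs and QALYs below are made-up numbers, not outputs of the study's model.

    ```python
    def best_strategy(strategies, threshold=100_000):
        """Return the most effective strategy whose ICER relative to the last
        accepted option is below the willingness-to-pay threshold.

        strategies: list of (name, lifetime_cost, QALYs), sorted by increasing
        effectiveness and assumed free of strictly dominated options.
        """
        accepted = strategies[0]
        for name, cost, qaly in strategies[1:]:
            icer = (cost - accepted[1]) / (qaly - accepted[2])
            if icer < threshold:
                accepted = (name, cost, qaly)
        return accepted

    # Hypothetical strategies (name, lifetime cost per woman in $, QALYs); illustration only.
    strategies = [
        ("reminder letters only", 1_000, 20.0000),
        ("10-yearly self-sampling", 1_150, 20.0020),
        ("5-yearly self-sampling", 1_400, 20.0040),
    ]
    name, cost, qaly = best_strategy(strategies)
    print(f"preferred strategy at $100,000/QALY: {name}")
    ```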

  1. Evaluating data worth for ground-water management under uncertainty

    USGS Publications Warehouse

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) The optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with greatest net economic benefit for ground-water management.
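
    The four-step loop above reduces, for each candidate budget, to a simple net-benefit comparison: the worth of a monitoring strategy is the projected reduction in management cost minus the cost of collecting the data. The sketch below shows that bookkeeping over a set of hypothetical budgets; all figures are placeholders, not outputs of the coupled optimization models.

    ```python
    def data_worth(alternatives):
        """Compute the net benefit of each monitoring alternative and pick the best.

        alternatives: list of (budget, projected_mgmt_cost) pairs, where the
        zero-budget entry gives the management cost under current uncertainty.
        """
        base_cost = dict(alternatives)[0]
        results = []
        for budget, mgmt_cost in alternatives:
            worth = base_cost - mgmt_cost               # value of the sample information
            results.append((budget, worth - budget))    # net economic benefit
        return max(results, key=lambda r: r[1]), results

    # Hypothetical budgets ($) and projected management costs after sampling ($).
    alternatives = [(0, 500_000), (20_000, 470_000), (50_000, 430_000), (100_000, 415_000)]
    best, results = data_worth(alternatives)
    for budget, net in results:
        print(f"budget {budget:>7,}: net benefit {net:>8,}")
    print(f"best data collection budget: {best[0]:,}")
    ```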

  2. Review of sample preparation strategies for MS-based metabolomic studies in industrial biotechnology.

    PubMed

    Causon, Tim J; Hann, Stephan

    2016-09-28

    Fermentation and cell culture biotechnology in the form of so-called "cell factories" now play an increasingly significant role in production of both large (e.g. proteins, biopharmaceuticals) and small organic molecules for a wide variety of applications. However, associated metabolic engineering optimisation processes relying on genetic modification of organisms used in cell factories, or alteration of production conditions remain a challenging undertaking for improving the final yield and quality of cell factory products. In addition to genomic, transcriptomic and proteomic workflows, analytical metabolomics continues to play a critical role in studying detailed aspects of critical pathways (e.g. via targeted quantification of metabolites), identification of biosynthetic intermediates, and also for phenotype differentiation and the elucidation of previously unknown pathways (e.g. via non-targeted strategies). However, the diversity of primary and secondary metabolites and the broad concentration ranges encompassed during typical biotechnological processes means that simultaneous extraction and robust analytical determination of all parts of interest of the metabolome is effectively impossible. As the integration of metabolome data with transcriptome and proteome data is an essential goal of both targeted and non-targeted methods addressing production optimisation goals, additional sample preparation steps beyond necessary sampling, quenching and extraction protocols including clean-up, analyte enrichment, and derivatisation are important considerations for some classes of metabolites, especially those present in low concentrations or exhibiting poor stability. This contribution critically assesses the potential of current sample preparation strategies applied in metabolomic studies of industrially-relevant cell factory organisms using mass spectrometry-based platforms primarily coupled to liquid-phase sample introduction (i.e. flow injection, liquid chromatography, or capillary electrophoresis). Particular focus is placed on the selectivity and degree of enrichment attainable, as well as demands of speed, absolute quantification, robustness and, ultimately, consideration of fully-integrated bioanalytical solutions to optimise sample handling and throughput. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Analysis of training sample selection strategies for regression-based quantitative landslide susceptibility mapping methods

    NASA Astrophysics Data System (ADS)

    Erener, Arzu; Sivas, A. Abdullah; Selcuk-Kestel, A. Sevtap; Düzgün, H. Sebnem

    2017-07-01

    All quantitative landslide susceptibility mapping (QLSM) methods require two basic data types, namely a landslide inventory and factors that influence landslide occurrence (landslide influencing factors, LIF). Depending on the type of landslides, the nature of the triggers and the LIF, the accuracy of QLSM methods differs. Moreover, how to balance the number of 0's (non-occurrence) and 1's (occurrence) in the training set obtained from the landslide inventory, and how to select which of the 1's and 0's to include in QLSM models, play a critical role in the accuracy of the QLSM. Although the performance of various QLSM methods has been widely investigated in the literature, the challenge of training set construction has not been adequately investigated for QLSM methods. In order to tackle this challenge, in this study three different training set selection strategies, along with the original data set, are used to test the performance of three different regression methods, namely Logistic Regression (LR), Bayesian Logistic Regression (BLR) and Fuzzy Logistic Regression (FLR). The first sampling strategy is proportional random sampling (PRS), which takes into account a weighted selection of landslide occurrences in the sample set. The second method, non-selective nearby sampling (NNS), includes randomly selected sites and their surrounding neighboring points at certain preselected distances to include the impact of clustering. Selective nearby sampling (SNS) is the third method, which concentrates on the group of 1's and their surrounding neighborhood; a randomly selected group of landslide sites and their neighborhoods are considered in the analyses, with parameters similar to NNS. It is found that the LR-PRS, FLR-PRS and BLR-whole-data set-ups, in that order, yield the best fits among the alternatives. The results indicate that in QLSM based on regression models, avoiding spatial correlation in the data set is critical for model performance.
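
    The proportional random sampling idea, keeping all (or a weighted share of) landslide occurrences and drawing a matching random sample of non-occurrence cells before fitting the regression, can be sketched with scikit-learn's logistic regression as below. The features, the 1:1 ratio, and the simulated data are illustrative assumptions, not the study's configuration.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(42)

    # Hypothetical raster cells: columns could stand for slope, aspect, lithology score, etc.
    X = rng.normal(size=(5000, 6))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1.5, 5000) > 2.5).astype(int)  # sparse 1's

    # Proportional random sampling: keep all occurrences, draw an equal number of
    # non-occurrence cells at random (a simple 1:1 balance for illustration).
    ones = np.where(y == 1)[0]
    zeros = rng.choice(np.where(y == 0)[0], size=len(ones), replace=False)
    train = np.concatenate([ones, zeros])

    model = LogisticRegression(max_iter=1000).fit(X[train], y[train])
    auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
    print(f"training 1:0 ratio = 1:1, AUC on all cells = {auc:.3f}")
    ```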

  4. Methodology Series Module 5: Sampling Strategies.

    PubMed

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice or on a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has in fact used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
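
    The distinction drawn here between probability designs can be made concrete with a short sketch; the sampling frame, column names and sample sizes below are hypothetical and only illustrate the chance mechanism that non-probability methods (convenience, purposive, quota) lack.

      import pandas as pd

      # Hypothetical sampling frame of clinic patients with a stratification variable.
      frame = pd.DataFrame({
          "patient_id": range(1, 1001),
          "sex": ["F", "M"] * 500,
      })

      # Simple random sample: every patient has the same chance of selection.
      srs = frame.sample(n=100, random_state=1)

      # Stratified random sample: a fixed fraction drawn at random from each stratum.
      strs = frame.groupby("sex").sample(frac=0.10, random_state=1)

      print(len(srs), strs["sex"].value_counts().to_dict())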

  5. Automatic detection of regions of interest in mammographic images

    NASA Astrophysics Data System (ADS)

    Cheng, Erkang; Ling, Haibin; Bakic, Predrag R.; Maidment, Andrew D. A.; Megalooikonomou, Vasileios

    2011-03-01

    This work is a part of our ongoing study aimed at comparing the topology of anatomical branching structures with the underlying image texture. Detection of regions of interest (ROIs) in clinical breast images serves as the first step in the development of an automated system for image analysis and breast cancer diagnosis. In this paper, we have investigated machine learning approaches for the task of identifying ROIs with visible breast ductal trees in a given galactographic image. Specifically, we have developed a boosting-based framework using the AdaBoost algorithm in combination with Haar wavelet features for ROI detection. Twenty-eight clinical galactograms with expert-annotated ROIs were used for training. Positive samples were generated by resampling near the annotated ROIs, and negative samples were generated randomly by image decomposition. Each detected ROI candidate was given a confidence score. Candidate ROIs with spatial overlap were merged and their confidence scores combined. We have compared three strategies for the elimination of false positives. The strategies differed in their approach to combining confidence scores: by summation, averaging, or selecting the maximum score. The strategies were compared based upon their spatial overlap with annotated ROIs. Using a 4-fold cross-validation with the annotated clinical galactographic images, the summation strategy showed the best performance with a 75% detection rate. When combining the top two candidates, the selection of the maximum score showed the best performance with a 96% detection rate.
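
    The score-combination step described above can be illustrated with a short sketch: overlapping candidate ROIs are grouped and their confidences merged by summation, averaging, or maximum. This is not the authors' code; the overlap threshold, the greedy grouping and the example boxes are assumptions made for illustration.

      import numpy as np

      def iou(a, b):
          """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
          x1, y1 = max(a[0], b[0]), max(a[1], b[1])
          x2, y2 = min(a[2], b[2]), min(a[3], b[3])
          inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
          area_a = (a[2] - a[0]) * (a[3] - a[1])
          area_b = (b[2] - b[0]) * (b[3] - b[1])
          return inter / (area_a + area_b - inter)

      def merge_candidates(boxes, scores, overlap_thr=0.3, mode="sum"):
          """Greedily group spatially overlapping candidates and combine their scores."""
          order = np.argsort(scores)[::-1]
          merged = []
          used = np.zeros(len(boxes), dtype=bool)
          for i in order:
              if used[i]:
                  continue
              group = [i]
              used[i] = True
              for j in order:
                  if not used[j] and iou(boxes[i], boxes[j]) > overlap_thr:
                      group.append(j)
                      used[j] = True
              s = [scores[k] for k in group]
              combined = {"sum": sum(s), "avg": sum(s) / len(s), "max": max(s)}[mode]
              merged.append((boxes[i], combined))
          return merged

      boxes = [(10, 10, 60, 60), (15, 12, 65, 58), (200, 180, 260, 240)]
      scores = [0.8, 0.6, 0.7]
      print(merge_candidates(boxes, scores, mode="sum"))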

  6. Lateral Flow Assay Based on Paper-Hydrogel Hybrid Material for Sensitive Point-of-Care Detection of Dengue Virus.

    PubMed

    Choi, Jane Ru; Yong, Kar Wey; Tang, Ruihua; Gong, Yan; Wen, Ting; Yang, Hui; Li, Ang; Chia, Yook Chin; Pingguan-Murphy, Belinda; Xu, Feng

    2017-01-01

    Paper-based devices have been broadly used for the point-of-care detection of dengue viral nucleic acids due to their simplicity, cost-effectiveness, and readily observable colorimetric readout. However, their moderate sensitivity and functionality have limited their applications. Despite the above-mentioned advantages, paper substrates are lacking in their ability to control fluid flow, in contrast to the flow control enabled by polymer substrates (e.g., agarose) with readily tunable pore size and porosity. Herein, drawing on the benefits of both materials, the authors propose a strategy to create a hybrid substrate by incorporating agarose into the test strip to achieve flow control for optimal biomolecule interactions. As compared to the unmodified test strip, this strategy allows sensitive detection of targets with an approximately tenfold signal improvement. Additionally, the authors showcase the potential of functionality improvement by creating multiple test zones for semi-quantification of targets, suggesting that the number of visible test zones is directly proportional to the target concentration. The authors further demonstrate the potential of their proposed strategy for clinical assessment by applying it to their prototype sample-to-result test strip to sensitively and semi-quantitatively detect dengue viral RNA from the clinical blood samples. This proposed strategy holds significant promise for detecting various targets for diverse future applications. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Evaluation of a Urine Pooling Strategy for the Rapid and Cost-Efficient Prevalence Classification of Schistosomiasis.

    PubMed

    Lo, Nathan C; Coulibaly, Jean T; Bendavid, Eran; N'Goran, Eliézer K; Utzinger, Jürg; Keiser, Jennifer; Bogoch, Isaac I; Andrews, Jason R

    2016-08-01

    A key epidemiologic feature of schistosomiasis is its focal distribution, which has important implications for the spatial targeting of preventive chemotherapy programs. We evaluated the diagnostic accuracy of a urine pooling strategy using a point-of-care circulating cathodic antigen (POC-CCA) cassette test for detection of Schistosoma mansoni, and employed simulation modeling to test the classification accuracy and efficiency of this strategy in determining where preventive chemotherapy is needed in low-endemicity settings. We performed a cross-sectional study involving 114 children aged 6-15 years in six neighborhoods in Azaguié Ahoua, south Côte d'Ivoire to characterize the sensitivity and specificity of the POC-CCA cassette test with urine samples that were tested individually and in pools of 4, 8, and 12. We used a Bayesian latent class model to estimate test characteristics for individual POC-CCA and quadruplicate Kato-Katz thick smears on stool samples. We then developed a microsimulation model and used lot quality assurance sampling to test the performance, number of tests, and total cost per school for each pooled testing strategy to predict the binary need for school-based preventive chemotherapy using a 10% prevalence threshold for treatment. The sensitivity of the urine pooling strategy for S. mansoni diagnosis using pool sizes of 4, 8, and 12 was 85.9%, 79.5%, and 65.4%, respectively, when POC-CCA trace results were considered positive, and 61.5%, 47.4%, and 30.8% when POC-CCA trace results were considered negative. The modeled specificity ranged from 94.0-97.7% for the urine pooling strategies (when POC-CCA trace results were considered negative). The urine pooling strategy, regardless of the pool size, gave comparable and often superior classification performance to stool microscopy for the same number of tests. The urine pooling strategy with a pool size of 4 reduced the number of tests and total cost compared to classical stool microscopy. This study introduces a method for rapid and efficient S. mansoni prevalence estimation through examining pooled urine samples with POC-CCA as an alternative to widely used stool microscopy.
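
    The logic of the pooled classification step can be sketched in a few lines: pool the urine samples, test each pool with an imperfect assay, and classify the school against the 10% prevalence threshold. This is not the authors' microsimulation or lot quality assurance sampling procedure; the pool-size-4 sensitivity is taken from the abstract (85.9%, trace considered positive), while the specificity, school size and the simple back-calculation (which ignores assay error) are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      def classify_school(prevalence, n_children=60, pool_size=4,
                          sensitivity=0.859, specificity=0.96, threshold=0.10):
          """Simulate pooled POC-CCA testing and classify the need for treatment.

          A pool is 'truly positive' if it contains at least one infected child;
          the assay then flags it with the given sensitivity/specificity.  The
          school is classified as needing preventive chemotherapy when the
          estimated prevalence implied by pool positivity exceeds the threshold.
          """
          infected = rng.random(n_children) < prevalence
          pools = infected.reshape(-1, pool_size)
          truly_pos = pools.any(axis=1)
          observed = np.where(truly_pos,
                              rng.random(truly_pos.size) < sensitivity,
                              rng.random(truly_pos.size) > specificity)
          # Back-calculate an individual-level prevalence estimate from pool positivity,
          # assuming a perfect assay for simplicity.
          pool_pos_rate = observed.mean()
          est_prev = 1.0 - (1.0 - min(pool_pos_rate, 0.999)) ** (1.0 / pool_size)
          return est_prev, est_prev >= threshold

      print(classify_school(prevalence=0.15))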

  8. Evaluation of a Urine Pooling Strategy for the Rapid and Cost-Efficient Prevalence Classification of Schistosomiasis

    PubMed Central

    Coulibaly, Jean T.; Bendavid, Eran; N’Goran, Eliézer K.; Utzinger, Jürg; Keiser, Jennifer; Bogoch, Isaac I.; Andrews, Jason R.

    2016-01-01

    Background A key epidemiologic feature of schistosomiasis is its focal distribution, which has important implications for the spatial targeting of preventive chemotherapy programs. We evaluated the diagnostic accuracy of a urine pooling strategy using a point-of-care circulating cathodic antigen (POC-CCA) cassette test for detection of Schistosoma mansoni, and employed simulation modeling to test the classification accuracy and efficiency of this strategy in determining where preventive chemotherapy is needed in low-endemicity settings. Methodology We performed a cross-sectional study involving 114 children aged 6–15 years in six neighborhoods in Azaguié Ahoua, south Côte d’Ivoire to characterize the sensitivity and specificity of the POC-CCA cassette test with urine samples that were tested individually and in pools of 4, 8, and 12. We used a Bayesian latent class model to estimate test characteristics for individual POC-CCA and quadruplicate Kato-Katz thick smears on stool samples. We then developed a microsimulation model and used lot quality assurance sampling to test the performance, number of tests, and total cost per school for each pooled testing strategy to predict the binary need for school-based preventive chemotherapy using a 10% prevalence threshold for treatment. Principal Findings The sensitivity of the urine pooling strategy for S. mansoni diagnosis using pool sizes of 4, 8, and 12 was 85.9%, 79.5%, and 65.4%, respectively, when POC-CCA trace results were considered positive, and 61.5%, 47.4%, and 30.8% when POC-CCA trace results were considered negative. The modeled specificity ranged from 94.0–97.7% for the urine pooling strategies (when POC-CCA trace results were considered negative). The urine pooling strategy, regardless of the pool size, gave comparable and often superior classification performance to stool microscopy for the same number of tests. The urine pooling strategy with a pool size of 4 reduced the number of tests and total cost compared to classical stool microscopy. Conclusions/Significance This study introduces a method for rapid and efficient S. mansoni prevalence estimation through examining pooled urine samples with POC-CCA as an alternative to widely used stool microscopy. PMID:27504954

  9. Culturally Specific Youth Substance Abuse Resistance Skills: Applicability across the U.S.-Mexico Border

    PubMed Central

    Marsiglia, Flavio F.; Kulis, Stephen; Rodriguez, Gregorio Martinez; Becerra, David; Castillo, Jason

    2009-01-01

    This study tests the applicability among adolescents in Mexico of the keepin' it REAL (refuse, explain, avoid, and leave) strategies that are common and effective ways that U.S. youth resist substance use. Following a social learning, communication competence and ecological theory integrated approach, the study draws on self-reported questionnaire data from a non-probability sample of 327 adolescents attending two public high schools in Monterrey, Nuevo León. Multivariate regressions were used to test whether the respondents' use of the REAL strategies could be predicted by key demographic variables. Separate models were estimated for the frequency of use of each strategy and for different substances. Findings indicate that most adolescents in this sample utilized each of the REAL strategies as well as other strategies to respond to offers of alcohol, cigarettes, or marijuana. Mexican and U.S. youth residing close to the U.S. border appear to use similar drug resistance strategies. Use of the strategies varied considerably by the level of exposure to offers, but only minimally by gender and age. There were no notable differences by socioeconomic status or academic performance. Implications for prevention science, social work practice and social work research are discussed in the context of the bi-national border region and the applicability and prospect for dissemination of U.S. evidence-based youth substance use prevention interventions. PMID:19924268

  10. Culturally Specific Youth Substance Abuse Resistance Skills: Applicability across the U.S.-Mexico Border.

    PubMed

    Marsiglia, Flavio F; Kulis, Stephen; Rodriguez, Gregorio Martinez; Becerra, David; Castillo, Jason

    2009-03-01

    This study tests the applicability among adolescents in Mexico of the keepin' it REAL (refuse, explain, avoid, and leave) strategies that are common and effective ways that U.S. youth resist substance use. Following a social learning, communication competence and ecological theory integrated approach, the study draws on self-reported questionnaire data from a non-probability sample of 327 adolescents attending two public high schools in Monterrey, Nuevo León. Multivariate regressions were used to test whether the respondents' use of the REAL strategies could be predicted by key demographic variables. Separate models were estimated for the frequency of use of each strategy and for different substances. Findings indicate that most adolescents in this sample utilized each of the REAL strategies as well as other strategies to respond to offers of alcohol, cigarettes, or marijuana. Mexican and U.S. youth residing close to the U.S. border appear to use similar drug resistance strategies. Use of the strategies varied considerably by the level of exposure to offers, but only minimally by gender and age. There were no notable differences by socioeconomic status or academic performance. Implications for prevention science, social work practice and social work research are discussed in the context of the bi-national border region and the applicability and prospect for dissemination of U.S. evidence-based youth substance use prevention interventions.

  11. Successful minority recruitment and adherence in physical activity Internet-based research: the WIN study.

    PubMed

    Frierson, Georita M; Morrow, James R; Vidales, Andrew

    2012-01-01

    Researchers studying physical activity often face challenges dealing with recruitment and resources, particularly when conducting longitudinal Internet-based research. Commonly raised methodological problems such as minority recruitment, participant commitment, and participant-staff involvement are addressed through a theoretically driven recruitment and adherence protocol in The Women's Exercise Injuries: Incidence and Risk Factors (WIN) Internet-based study. The objectives of this paper were to review and suggest solutions to problems of: (1) low recruitment of diverse samples, (2) low adherence, and (3) staffing needs. We recruited 1303 community-dwelling women and followed them through a multiple-phase, longitudinal, Internet-based study. Recruitment and adherence data were analyzed through descriptive methods and logistic regressions to examine participant adherence and sociodemographic factors and predictors of who entered the long-term phase of the study. We successfully retained 71.6% of the sample through 4 recruitment phases. Twenty-seven percent of the initially recruited sample was racially/ethnically diverse, 24% began the long-term phase, and 23% completed it. Several strategies to enhance participant commitment were successfully used during the practice phase, providing a successful, low staff-to-participant ratio. Logistic regression indicated that being married, being older, and having greater Internet skills were predictive of successfully entering the long-term phase of the study. Recruitment and compliance protocols were successful in meeting overall and racial/ethnic enrollment and recruitment goals. The theoretically based practice phase techniques were successful in re-engaging noncompliant participants. Strategies for minority enrollment and compliance are evaluated.

  12. A randomized controlled trial of strong minds: A school-based mental health program combining acceptance and commitment therapy and positive psychology.

    PubMed

    Burckhardt, Rowan; Manicavasagar, Vijaya; Batterham, Philip J; Hadzi-Pavlovic, Dusan

    2016-08-01

    To date, most early intervention programs have been based on emotion regulation strategies that address dysfunctional cognitive appraisals, problem-solving skills, and rumination. Another emotion regulation strategy, 'acceptance' training, has largely been overlooked. To examine the efficacy of this strategy, a school-based mental health program combining positive psychology with acceptance and commitment therapy (Strong Minds) was evaluated in a randomized controlled trial with a sample of 267 Year 10 and 11 high-school students in Sydney, Australia. Mixed models for repeated measures examined whether the program led to reductions in symptoms amongst students who commenced the program with high depression, anxiety, and stress scores, and increased wellbeing scores amongst all students. Results demonstrated that compared to controls, participants in the Strong Minds condition with elevated symptom scores (n=63) reported significant reductions in depression (p=.047), stress (p=.01), and composite depression/anxiety symptoms (p=.02) with medium to strong effect sizes (Cohen's d=0.53, 0.74, and 0.57, respectively). Increased wellbeing (p=.03) in the total sample and decreased anxiety scores (p=.048) for students with elevated symptoms were significant for Year 10 students with medium effect sizes (Cohen's d=0.43 and 0.54, respectively). This study tentatively suggests that including the emotion regulation strategy of acceptance in early intervention programs may be effective in reducing symptoms and improving wellbeing in high school students. Further research to investigate the generalizability of these findings is warranted. Copyright © 2016 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  13. A comparative research of different ensemble surrogate models based on set pair analysis for the DNAPL-contaminated aquifer remediation strategy optimization.

    PubMed

    Hou, Zeyu; Lu, Wenxi; Xue, Haibo; Lin, Jin

    2017-08-01

    Surrogate-based simulation-optimization is an effective approach for optimizing the surfactant enhanced aquifer remediation (SEAR) strategy for clearing DNAPLs. The performance of the surrogate model, which replaces the simulation model in order to reduce the computational burden, is the key issue in this line of research. However, previous studies are generally based on a stand-alone surrogate model and rarely attempt to improve the surrogate model's approximation accuracy to the simulation model by combining various methods. In this regard, we present set pair analysis (SPA) as a new method to build an ensemble surrogate (ES) model, and conducted a comparative study to select a better ES modeling pattern for SEAR strategy optimization problems. Surrogate models were developed using a radial basis function artificial neural network (RBFANN), support vector regression (SVR), and Kriging. One ES model assembles the RBFANN, SVR, and Kriging models using set pair weights according to their performance, and the other assembles several Kriging models (Kriging being the best of the three surrogate modeling methods) built with different training sample datasets. Finally, an optimization model, in which the ES model was embedded, was established to obtain the optimal remediation strategy. The results showed that the residuals between the outputs of the best ES model and the simulation model for 100 testing samples were below 1.5%. Using an ES model instead of the simulation model was critical for considerably reducing the computation time of the simulation-optimization process while maintaining high computational accuracy. Copyright © 2017 Elsevier B.V. All rights reserved.
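
    The ensemble idea above boils down to fitting several surrogates and combining their predictions with performance-based weights. The sketch below uses simple inverse-validation-RMSE weights as a stand-in for the set pair weights derived in the paper, a Gaussian process as the Kriging analogue and an MLP in place of the RBFANN; the data-generating function and all parameter values are illustrative assumptions, not the SEAR simulation model.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor   # Kriging analogue
      from sklearn.svm import SVR
      from sklearn.neural_network import MLPRegressor                 # stands in for RBFANN

      rng = np.random.default_rng(1)

      # Hypothetical simulation data: inputs are remediation design variables,
      # output is the simulated contaminant removal (placeholder function).
      X = rng.uniform(0, 1, size=(120, 3))
      y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 - 0.5 * X[:, 2] + rng.normal(0, 0.02, 120)
      X_tr, y_tr, X_val, y_val = X[:80], y[:80], X[80:], y[80:]

      surrogates = [GaussianProcessRegressor(alpha=1e-3),   # small nugget for stability
                    SVR(C=10.0),
                    MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000, random_state=0)]
      for m in surrogates:
          m.fit(X_tr, y_tr)

      # Performance-based weights (inverse validation RMSE here; the paper derives
      # its weights from set pair analysis, which is not reproduced in this sketch).
      rmse = np.array([np.sqrt(np.mean((m.predict(X_val) - y_val) ** 2)) for m in surrogates])
      weights = (1.0 / rmse) / (1.0 / rmse).sum()

      def ensemble_predict(X_new):
          preds = np.column_stack([m.predict(X_new) for m in surrogates])
          return preds @ weights

      print(ensemble_predict(X_val[:5]))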

  14. The weight management strategies inventory (WMSI). Development of a new measurement instrument, construct validation, and association with dieting success.

    PubMed

    Keller, Carmen; Siegrist, Michael

    2015-09-01

    In an obesogenic environment, people have to adopt effective weight management strategies to successfully gain or maintain normal body weight. Little is known about the strategies used by the general population in daily life. Due to the lack of a comprehensive measurement instrument to assess conceptually different strategies with various scales, we developed the weight management strategies inventory (WMSI). In study 1, we collected 19 weight management strategies from research on self-regulation of food intake and successful weight loss and maintenance, as well as from expert interviews. We classified them under the five main categories of health self-regulation strategies - goal setting and monitoring, prospection and planning, automating behavior, construal, and inhibition. We formulated 93 items. In study 2, we developed the WMSI in a random sample from the general population (N = 658), using reliability and exploratory factor analysis. This resulted in 19 factors with 63 items, representing the 19 strategies. In study 3, we tested the 19-factor structure in a quota (age, gender) sample from the general population (N = 616), using confirmatory factor analysis. A good model fit (CFI = .918; RMSEA = .043) was revealed. Reliabilities and construct validity were high. Positive correlations of most strategies with dieting success and negative correlations of some strategies with body mass index were found among dieters (N = 292). Study 4 (N = 162) revealed a good test-retest reliability. The WMSI assesses theoretically derived, evidence-based, and conceptually different weight management strategies with different scales that have good psychometric characteristics. The scales can also be used for pre- and post-measures in intervention studies. The scales provide insights into the general population's weight management strategies and facilitate tailoring and evaluating health communication. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Utilizing the ultrasensitive Schistosoma up-converting phosphor lateral flow circulating anodic antigen (UCP-LF CAA) assay for sample pooling-strategies.

    PubMed

    Corstjens, Paul L A M; Hoekstra, Pytsje T; de Dood, Claudia J; van Dam, Govert J

    2017-11-01

    Methodological applications of the high-sensitivity genus-specific Schistosoma CAA strip test, allowing detection of single-worm active infections (ultimate sensitivity), are discussed for efficient utilization in sample pooling strategies. Besides relevant cost reduction, pooling of samples rather than individual testing can provide valuable data for large-scale mapping, surveillance, and monitoring. The laboratory-based CAA strip test utilizes luminescent quantitative up-converting phosphor (UCP) reporter particles and a rapid user-friendly lateral flow (LF) assay format. The test includes a sample preparation step that permits virtually unlimited sample concentration with urine, reaching ultimate sensitivity (single worm detection) at 100% specificity. This facilitates testing large urine pools from many individuals with minimal loss of sensitivity and specificity. The test determines the average CAA level of the individuals in the pool, thus indicating overall worm burden and prevalence. When test results are required at the individual level, smaller pools need to be analysed, with the pool size based on the expected prevalence or, when unknown, on the average CAA level of a larger group; CAA-negative pools do not require individual test results and thus reduce the number of tests. Straightforward pooling strategies indicate that at the sub-population level the CAA strip test is an efficient assay for general mapping, identification of hotspots, determination of stratified infection levels, and accurate monitoring of mass drug administrations (MDA). At the individual level, the number of tests can be reduced, e.g. in low-endemicity settings, where the pool size can be increased as prevalence decreases. At the sub-population level, average CAA concentrations determined in urine pools can be an appropriate measure indicating worm burden. Pooling strategies allowing this type of large-scale testing are feasible with the various CAA strip test formats and do not affect sensitivity and specificity. This allows cost-efficient stratified testing and monitoring of worm burden at the sub-population level, ideally for large-scale surveillance generating hard data on the performance of MDA programs and for strategic planning when moving towards transmission-stop and elimination.
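
    The arithmetic behind "negative pools do not require individual test results" is the classical two-stage pooled-testing calculation. The sketch below is not from the paper (which also exploits the quantitative CAA level of each pool) and assumes a perfect assay; it only shows why larger pools pay off as prevalence drops.

      def expected_tests_per_person(prevalence, pool_size):
          """Two-stage pooled testing: test each pool once, then re-test every member
          of a positive pool individually (perfect assay assumed for simplicity)."""
          p_pool_negative = (1.0 - prevalence) ** pool_size
          return 1.0 / pool_size + (1.0 - p_pool_negative)

      for p in (0.01, 0.05, 0.20):
          best = min(range(2, 31), key=lambda k: expected_tests_per_person(p, k))
          print(f"prevalence {p:>4.0%}: best pool size {best:2d}, "
                f"{expected_tests_per_person(p, best):.2f} tests per person")

    At 1% prevalence the expected workload drops to roughly a fifth of individual testing, whereas at 20% prevalence pooling buys comparatively little, which mirrors the strategy's emphasis on low-endemicity settings.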

  16. Chemometrics-enhanced high performance liquid chromatography-diode array detection strategy for simultaneous determination of eight co-eluted compounds in ten kinds of Chinese teas using second-order calibration method based on alternating trilinear decomposition algorithm.

    PubMed

    Yin, Xiao-Li; Wu, Hai-Long; Gu, Hui-Wen; Zhang, Xiao-Hua; Sun, Yan-Mei; Hu, Yong; Liu, Lu; Rong, Qi-Ming; Yu, Ru-Qin

    2014-10-17

    In this work, an attractive chemometrics-enhanced high performance liquid chromatography-diode array detection (HPLC-DAD) strategy was proposed for the simultaneous and fast determination of eight co-eluted compounds, including gallic acid, caffeine and six catechins, in ten kinds of Chinese teas by using a second-order calibration method based on the alternating trilinear decomposition (ATLD) algorithm. This new strategy proved to be a useful tool for handling the co-eluted peaks, uncalibrated interferences and baseline drifts arising during chromatographic separation, benefiting from the "second-order advantage" and enabling the determination of gallic acid, caffeine and six catechins in tea infusions within 8 min under simple mobile phase conditions. The average recoveries of the analytes in two selected tea samples ranged from 91.7 to 103.1%, with standard deviations (SD) ranging from 1.9 to 11.9%. Figures of merit including sensitivity (SEN), selectivity (SEL), root-mean-square error of prediction (RMSEP) and limit of detection (LOD) were calculated to validate the accuracy of the proposed method. To further confirm the reliability of the method, a multiple reaction monitoring (MRM) method based on LC-MS/MS was employed for comparison, and the results of both methods were consistent with each other. Furthermore, as a universal strategy, this newly proposed analytical method was applied to the determination of gallic acid, caffeine and catechins in several other kinds of Chinese teas, including different levels and varieties. Finally, based on the quantitative results, principal component analysis (PCA) was used to conduct a cluster analysis of these Chinese teas. The green tea, Oolong tea and Pu-erh raw tea samples were classified successfully. All results demonstrated that the proposed method is accurate, sensitive, fast, universal and ideal for the rapid, routine analysis and discrimination of gallic acid, caffeine and catechins in Chinese tea samples. Copyright © 2014 Elsevier B.V. All rights reserved.
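
    The final PCA clustering step takes nothing more than the quantified concentration table as input. The sketch below illustrates that step only; the concentrations, group means and sample counts are fabricated for illustration and are not the paper's data, and the ATLD second-order calibration itself is not reproduced.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(7)

      # Hypothetical quantification results (mg/g) for gallic acid, caffeine and
      # six catechins in three groups of tea samples; values are illustrative only.
      green  = rng.normal([2, 30, 5, 60, 10, 8, 12, 3], 1.5, size=(10, 8))
      oolong = rng.normal([3, 25, 4, 35, 7, 6, 9, 2],  1.5, size=(10, 8))
      puerh  = rng.normal([6, 28, 2, 10, 3, 2, 4, 1],  1.5, size=(10, 8))
      X = np.vstack([green, oolong, puerh])
      labels = ["green"] * 10 + ["oolong"] * 10 + ["puerh"] * 10

      # Autoscale the concentrations, project onto the first two principal components
      # and report the group centroids, which separate when the profiles differ.
      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
      for lab in ("green", "oolong", "puerh"):
          idx = [i for i, l in enumerate(labels) if l == lab]
          centre = scores[idx].mean(axis=0)
          print(f"{lab:>6}: PC1 = {centre[0]:6.2f}, PC2 = {centre[1]:6.2f}")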

  17. Methodology Series Module 5: Sampling Strategies

    PubMed Central

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice or on a population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. Random sampling (such as a simple random sample or a stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and to mention the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has in fact used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  18. Synchronization of hybrid coupled reaction-diffusion neural networks with time delays via generalized intermittent control with spacial sampled-data.

    PubMed

    Lu, Binglong; Jiang, Haijun; Hu, Cheng; Abdurahman, Abdujelil

    2018-05-04

    The exponential synchronization of hybrid coupled reaction-diffusion neural networks with time delays is discussed in this article. At first, a generalized intermittent control with spacial sampled-data is introduced, which is intermittent in time and data sampling in space. This type of control strategy not only can unify the traditional periodic intermittent control and the aperiodic case, but also can lower the update rate of the controller in both temporal and spatial domains. Next, based on the designed control protocol and the Lyapunov-Krasovskii functional approach, some novel and readily verified criteria are established to guarantee the exponential synchronization of the considered networks. These criteria depend on the diffusion coefficients, coupled strengths, time delays as well as control parameters. Finally, the effectiveness of the proposed control strategy is shown by a numerical example. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Comparing decision making between cancer patients and the general population: thoughts, emotions, or social influence?

    PubMed

    Yang, Z Janet; McComas, Katherine A; Gay, Geri K; Leonard, John P; Dannenberg, Andrew J; Dillon, Hildy

    2012-01-01

    This study extends a risk information seeking and processing model to explore the relative effect of cognitive processing strategies, positive and negative emotions, and normative beliefs on individuals' decision making about potential health risks. Most previous research based on this theoretical framework has examined environmental risks. Applying this risk communication model to study health decision making presents an opportunity to explore theoretical boundaries of the model, while also bringing this research to bear on a pressing medical issue: low enrollment in clinical trials. Comparative analysis of data gathered from 2 telephone surveys of a representative national sample (n = 500) and a random sample of cancer patients (n = 411) indicated that emotions played a more substantive role in cancer patients' decisions to enroll in a potential trial, whereas cognitive processing strategies and normative beliefs had greater influences on the decisions of respondents from the national sample.

  20. [Use of blood lead data to evaluate and prevent childhood lead poisoning in Latin America].

    PubMed

    Romieu, Isabelle

    2003-01-01

    Exposure to lead is a widespread and serious threat to the health of children in Latin America. Health officials should monitor sources of exposure and health outcomes to design, implement, and evaluate prevention and control activities. To evaluate the magnitude of lead as a public health problem, three key elements must be defined: 1) the potential sources of exposure, 2) the indicators to evaluate health effects and environmental exposure, and 3) the sampling methods for the population at risk. Several strategies can be used to select the study population depending on the study objectives, the time limitations, and the available resources. If the objective is to evaluate the magnitude and sources of the problem, the following sampling methods can be used: 1) population-based random sampling; 2) facility-based random sampling within hospitals, daycare centers, or schools; 3) target sampling of high risk groups; 4) convenience sampling of volunteers; and 5) case reporting (which can lead to the identification of populations at risk and sources of exposures). For all sampling methods, information gathering should include the use of a questionnaire to collect general information on the participants and on potential local sources of exposure, as well as the collection of biological samples. In interpreting data, one should consider the type of sampling used and the non-response rates, as well as factors that might influence blood lead measurements, such as age and seasonal variability. Blood lead measurements should be integrated in an overall strategy to prevent lead toxicity in children. The English version of this paper is available at: http://www.insp.mx/salud/index.html.

  1. The theory and implementation of a high quality pulse width modulated waveform synthesiser applicable to voltage fed inverters

    NASA Astrophysics Data System (ADS)

    Lower, Kim Nigel

    1985-03-01

    Modulation processes associated with the digital implementation of pulse width modulation (PWM) switching strategies were examined. A software package based on a portable turnkey structure is presented. Waveform synthesizer implementation techniques are reviewed. A three-phase PWM waveform synthesizer for voltage-fed inverters was realized. It is based on a constant carrier frequency of 18 kHz and a regular-sampled, single-edge, asynchronous PWM switching scheme. With high carrier frequencies it is possible to utilize simple switching strategies and, as a consequence, many advantages are highlighted, emphasizing their importance to the industrial and office markets.
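
    Regular (uniform) sampling means the sinusoidal reference is sampled once per carrier period and held, so each pulse width is known before that period starts, which is what makes the scheme convenient to implement digitally. The following sketch computes single-edge duty cycles for an 18 kHz carrier; the reference frequency, modulation index and variable names are assumptions for illustration, not values from the thesis.

      import numpy as np

      F_CARRIER = 18_000.0      # carrier frequency, Hz
      F_REF = 50.0              # fundamental reference frequency, Hz
      M = 0.8                   # modulation index (0..1)

      def regular_sampled_duties(n_periods, phase=0.0):
          """Duty cycles for regular-sampled, single-edge (trailing-edge) PWM:
          the reference sine is sampled once at the start of each carrier period
          and held, so the pulse width of that period is fixed in advance."""
          k = np.arange(n_periods)
          t_sample = k / F_CARRIER                        # sampling instants
          ref = M * np.sin(2 * np.pi * F_REF * t_sample + phase)
          return 0.5 * (1.0 + ref)                        # map [-1, 1] -> [0, 1]

      # One fundamental cycle spans F_CARRIER / F_REF = 360 carrier periods.
      duties = regular_sampled_duties(int(F_CARRIER / F_REF))
      pulse_widths_us = duties / F_CARRIER * 1e6          # on-times in microseconds
      print(pulse_widths_us[:5])

    A three-phase synthesizer would run the same routine three times with the phase argument offset by ±120°.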

  2. Pre-analytic evaluation of volumetric absorptive microsampling and integration in a mass spectrometry-based metabolomics workflow.

    PubMed

    Volani, Chiara; Caprioli, Giulia; Calderisi, Giovanni; Sigurdsson, Baldur B; Rainer, Johannes; Gentilini, Ivo; Hicks, Andrew A; Pramstaller, Peter P; Weiss, Guenter; Smarason, Sigurdur V; Paglia, Giuseppe

    2017-10-01

    Volumetric absorptive microsampling (VAMS) is a novel approach that allows single-drop (10 μL) blood collection. Integration of VAMS with mass spectrometry (MS)-based untargeted metabolomics is an attractive solution for both human and animal studies. However, to boost the use of VAMS in metabolomics, key pre-analytical questions need to be addressed. Therefore, in this work, we integrated VAMS in a MS-based untargeted metabolomics workflow and investigated pre-analytical strategies such as sample extraction procedures and metabolome stability under different storage conditions. We first evaluated the best extraction procedure for the polar metabolome and found that the highest number and amount of metabolites were recovered upon extraction with acetonitrile/water (70:30). In contrast, basic conditions (pH 9) resulted in divergent metabolite profiles mainly resulting from the extraction of intracellular metabolites originating from red blood cells. In addition, the prolonged storage of blood samples at room temperature caused significant changes in metabolome composition, but once the VAMS devices were stored at -80 °C, the metabolome remained stable for up to 6 months. The time used for drying the sample also affected the metabolome. In fact, some metabolites were rapidly degraded or accumulated in the sample during the first 48 h at room temperature, indicating that a longer drying step will significantly change the concentrations in the sample. Graphical abstract Volumetric absorptive microsampling (VAMS) is a novel technology that allows single-drop blood collection and, in combination with mass spectrometry (MS)-based untargeted metabolomics, represents an attractive solution for both human and animal studies. In this work, we integrated VAMS in a MS-based untargeted metabolomics workflow and investigated pre-analytical strategies such as sample extraction procedures and metabolome stability under different storage conditions. The latter revealed that prolonged storage of blood samples at room temperature caused significant changes in metabolome composition, but if VAMS devices were stored at -80 °C, the metabolome remained stable for up to 6 months.

  3. The correspondence of surface climate parameters with satellite and terrain data

    NASA Technical Reports Server (NTRS)

    Dozier, Jeff; Davis, Frank

    1987-01-01

    One of the goals of the research was to develop a ground sampling strategy for calibrating remotely sensed measurements of surface climate parameters. The initial sampling strategy involved the stratification of the terrain based on important ancillary surface variables such as slope, exposure, insolation, geology, drainage, fire history, etc. For a spatially heterogeneous population, sampling error is reduced and efficiency increased by stratification of the landscape into more homogeneous sub-areas and by employing periodic random spacing of samples. These concepts were applied in the initial stratification of the study site for the purpose of locating and allocating instrumentation.
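
    As a small sketch of the allocation step in such a stratified design (the strata, cell counts and plot budget below are hypothetical, and only proportional allocation is shown, not the periodic random spacing mentioned above):

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical terrain strata (e.g. from slope/exposure/geology overlays)
      # with the number of candidate grid cells falling in each stratum.
      strata = {"north-facing ridge": 1200, "south-facing slope": 2100,
                "valley floor": 900, "recently burned": 300}
      total_plots = 60                                  # instruments available

      sizes = np.array(list(strata.values()), dtype=float)
      alloc = np.round(total_plots * sizes / sizes.sum()).astype(int)

      for (name, n_cells), n_samples in zip(strata.items(), alloc):
          chosen = rng.choice(n_cells, size=n_samples, replace=False)
          print(f"{name:>20}: {n_samples:2d} plots at cell indices {np.sort(chosen)[:5]} ...")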

  4. Negative caregiver strategies and psychopathology in urban, African-American young adults.

    PubMed

    Koenig, Amy L; Ialongo, Nicholas; Wagner, Barry M; Poduska, Jeanne; Kellam, Sheppard

    2002-12-01

    There were three aims: (1) to assess the prevalence of reported exposure to negative caregiver strategies in a community-based African-American population; (2) to examine the sources of variation in caregiver parenting strategies, including demographic variables and child characteristics; and (3) to investigate whether mental disorders in young adulthood may differ based on the reported degree of exposure to negative strategies. The participants were 1197 African-Americans involved in a 1999-2001 young adult follow-up (age M=19.6, SD=.6) of an evaluation of school-based interventions in the Baltimore, MD metropolitan area. Measures included teacher-report of child aggression in first grade, parent-report of demographic variables in first and sixth grade, and young adult self-report of symptomatology, suicidal behaviors, and childhood caregiver discipline strategies. Fifty-four percent of the sample reported some use of physical discipline by caregivers. Lower family income and younger caregiver age, as well as teacher reports of child aggression, were related to reports of caregivers' high use of negative strategies. In addition, young adults who reported a high rate of negative caregiver strategies had a significantly increased risk for psychopathology and were over twice as likely to have experienced a history of suicidal ideation than those reporting low exposure. The results demonstrate the importance of examining variation in this population, with the poorest and the youngest caregivers using negative parenting strategies more frequently. In addition, the present study replicated previous findings of the link between negative caregiver discipline strategies and psychopathology. This association appears robust across diverse populations. The implications for preventive interventions are discussed.

  5. Coping Work Strategies and Job Satisfaction Among Iranian Nurses

    PubMed Central

    Ghiyasvandian, Shahrzad; Adera Gebra, Addis

    2014-01-01

    Context: Nursing is a stressful job that can give rise to physical and psychological disorders. Many studies have presented information on stress, the effects of coping strategies, and the job satisfaction of nurses within health settings. We aimed to identify and describe the nursing stresses, coping strategies and job satisfaction of Iranian nurses who are working or have worked in different wards. Evidence Acquisition: In this review, we studied peer-reviewed journal articles in the field of stress, coping strategies and job satisfaction in nursing practice, especially among Iranian nurses, published between 2000 and 2013. In this regard, we searched the PubMed, Elsevier, Google, BMJ, PMC, and MEDLINE databases. Results: The majority of the studies (60%) analyzed the effect of coping strategies and the experience and perception of job-related stresses in Iranian nurses working in hospitals. In most of the reviewed studies (60%), the samples consisted of Iranian nurses. Forty percent of the studies selected a maximum sample size of 565 (44%) participants in 2011. The nursing stress scale, employed in 30% of the studies, was the most commonly used instrument. The reviewed studies also revealed a combined measurement approach (60% of studies) based on categorical stress measurement, the effects of coping strategies, and job satisfaction methods. Three studies explored the relationship between job stress and job satisfaction. For instance, the majority (74.4%) of nurses reported job satisfaction. Conclusions: The effect of coping strategies and job satisfaction on Iranian nurses is a well-accepted issue with important positive outcomes in several areas of the health discipline. PMID:25068050

  6. Sample-Based Surface Coloring

    PubMed Central

    Bürger, Kai; Krüger, Jens; Westermann, Rüdiger

    2011-01-01

    In this paper, we present a sample-based approach for surface coloring, which is independent of the original surface resolution and representation. To achieve this, we introduce the Orthogonal Fragment Buffer (OFB)—an extension of the Layered Depth Cube—as a high-resolution view-independent surface representation. The OFB is a data structure that stores surface samples at a nearly uniform distribution over the surface, and it is specifically designed to support efficient random read/write access to these samples. The data access operations have a complexity that is logarithmic in the depth complexity of the surface. Thus, compared to data access operations in tree data structures like octrees, data-dependent memory access patterns are greatly reduced. Due to the particular sampling strategy that is employed to generate an OFB, it also maintains sample coherence, and thus, exhibits very good spatial access locality. Therefore, OFB-based surface coloring performs significantly faster than sample-based approaches using tree structures. In addition, since in an OFB, the surface samples are internally stored in uniform 2D grids, OFB-based surface coloring can efficiently be realized on the GPU to enable interactive coloring of high-resolution surfaces. On the OFB, we introduce novel algorithms for color painting using volumetric and surface-aligned brushes, and we present new approaches for particle-based color advection along surfaces in real time. Due to the intermediate surface representation we choose, our method can be used to color polygonal surfaces as well as any other type of surface that can be sampled. PMID:20616392

  7. Red-shouldered hawk occupancy surveys in central Minnesota, USA

    USGS Publications Warehouse

    Henneman, C.; McLeod, M.A.; Andersen, D.E.

    2007-01-01

    Forest-dwelling raptors are often difficult to detect because many species occur at low density or are secretive. Broadcasting conspecific vocalizations can increase the probability of detecting forest-dwelling raptors and has been shown to be an effective method for locating raptors and assessing their relative abundance. Recent advances in statistical techniques based on presence-absence data use probabilistic arguments to derive probability of detection when it is <1 and to provide a model and likelihood-based method for estimating proportion of sites occupied. We used these maximum-likelihood models with data from red-shouldered hawk (Buteo lineatus) call-broadcast surveys conducted in central Minnesota, USA, in 1994-1995 and 2004-2005. Our objectives were to obtain estimates of occupancy and detection probability 1) over multiple sampling seasons (yr), 2) incorporating within-season time-specific detection probabilities, 3) with call type and breeding stage included as covariates in models of probability of detection, and 4) with different sampling strategies. We visited individual survey locations 2-9 times per year, and estimates of both probability of detection (range = 0.28-0.54) and site occupancy (range = 0.81-0.97) varied among years. Detection probability was affected by inclusion of a within-season time-specific covariate, call type, and breeding stage. In 2004 and 2005 we used survey results to assess the effect that number of sample locations, double sampling, and discontinued sampling had on parameter estimates. We found that estimates of probability of detection and proportion of sites occupied were similar across different sampling strategies, and we suggest ways to reduce sampling effort in a monitoring program.
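
    The presence-absence machinery referred to above rests on a simple likelihood in which site occupancy (psi) and per-visit detection probability (p) are estimated jointly. The sketch below is a generic single-season occupancy model fitted to simulated detection histories, not the red-shouldered hawk data or the covariate models used in the study; the site counts, visit numbers and true parameter values are assumptions.

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(5)

      # Simulate detection histories: n_sites surveyed n_visits times each.
      n_sites, n_visits, psi_true, p_true = 80, 5, 0.85, 0.4
      occupied = rng.random(n_sites) < psi_true
      histories = (rng.random((n_sites, n_visits)) < p_true) & occupied[:, None]

      def neg_log_likelihood(params):
          """Single-season occupancy model: psi = P(occupied), p = P(detection | occupied)."""
          psi, p = 1 / (1 + np.exp(-params))            # logit -> probability
          detections = histories.sum(axis=1)
          # Likelihood of each site's history: detected at least once vs. never detected.
          lik = np.where(detections > 0,
                         psi * p ** detections * (1 - p) ** (n_visits - detections),
                         psi * (1 - p) ** n_visits + (1 - psi))
          return -np.log(lik).sum()

      fit = minimize(neg_log_likelihood, x0=np.zeros(2), method="Nelder-Mead")
      psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
      print(f"estimated occupancy = {psi_hat:.2f}, detection probability = {p_hat:.2f}")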

  8. Integrating Field-Based Research into the Classroom: An Environmental Sampling Exercise

    ERIC Educational Resources Information Center

    DeSutter, T.; Viall, E.; Rijal, I.; Murdoff, M.; Guy, A.; Pang, X.; Koltes, S.; Luciano, R.; Bai, X.; Zitnick, K.; Wang, S.; Podrebarac, F.; Casey, F.; Hopkins, D.

    2010-01-01

    A field-based, soil methods, and instrumentation course was developed to expose graduate students to numerous strategies for measuring soil parameters. Given the northern latitude of North Dakota State University and the rapid onset of winter, this course met once per week for the first 8 weeks of the fall semester and centered on the field as a…

  9. Construction and Validation of the Clinical Judgment Skill Inventory: Clinical Judgment Skill Competencies That Measure Counselor Debiasing Techniques

    ERIC Educational Resources Information Center

    Austin, Bryan S.; Leahy, Michael J.

    2015-01-01

    Purpose: To construct and validate a new self-report instrument, the Clinical Judgment Skill Inventory (CJSI), inclusive of clinical judgment skill competencies that address counselor biases and evidence-based strategies. Method: An Internet-based survey design was used and an exploratory factor analysis was performed on a sample of rehabilitation…

  10. An Examination of Strategies for the Prevention of Gender-Based Violence at Four-Year Institutions of Higher Education

    ERIC Educational Resources Information Center

    Kafonek, Katherine; Richards, Tara N.

    2017-01-01

    Although gender-based violence prevention programs at institutions of higher education (IHEs) are mandated by federal legislation, research focusing on the prevalence or content of programming is limited. The present exploratory research examines campus websites for a nationally representative sample of Title IX eligible IHEs that offer at least a…

  11. Novel highly-performing immunosensor-based strategy for ochratoxin A detection in wine samples.

    PubMed

    Prieto-Simón, Beatriz; Campàs, Mònica; Marty, Jean-Louis; Noguer, Thierry

    2008-02-28

    The increasing concern about ochratoxin A (OTA) contamination of different food and feedstuffs demands high-performing detection techniques for quality assessment. Two indirect competitive enzyme-linked immunosorbent assay (ELISA) strategies were investigated for the development of OTA electrochemical immunosensors based on different OTA immobilisation procedures. Immunosensors based on avidin/biotin-OTA showed enhanced performance characteristics compared to those based on the adsorption of a bovine serum albumin (BSA)-OTA conjugate. The performance of polyclonal (PAb) and monoclonal (MAb) antibodies against OTA was compared, showing at least one order of magnitude lower IC50 values when working with MAb. Alkaline phosphatase (ALP)- and horseradish peroxidase (HRP)-labelled secondary antibodies were evaluated. Both conjugates led to similar results when working with OTA standard solutions in buffer. However, whereas electroactive interferences present in spiked wine samples did not affect HRP-labelled immunosensors (4% slope deviation), they were likely oxidised at 0.225 V versus Ag/AgCl, the working potential for ALP-labelled immunosensors (25% slope deviation). Considering 80% of antibody binding as the limit of detection, values of 0.7 and 0.3 ng/mL were obtained for the HRP- and ALP-labelled immunosensors, respectively, validating these immunosensors as useful screening tools to assess OTA levels in wine.

  12. Purification and Quantification of an Isomeric Compound in a Mixture by Collisional Excitation in Multistage Mass Spectrometry Experiments.

    PubMed

    Jeanne Dit Fouque, Dany; Maroto, Alicia; Memboeuf, Antony

    2016-11-15

    The differentiation, characterization, and quantification of isomers and/or isobars in mixtures is a recurrent problem in mass spectrometry and, more generally, in analytical chemistry. Here we present a new strategy to assess the purity of a compound that is susceptible to contamination with an isomeric side-product at trace levels. Provided one of the isomers is available as a pure sample, this new strategy allows the detection of isomeric contamination. This is done thanks to a "gas-phase collisional purification" inside an ion trap mass spectrometer, paving the way for an improved analysis of at least similar samples. The strategy consists in using collision-induced dissociation (CID) multistage mass spectrometry (MS2 and MS3) experiments and the survival yield (SY) technique. It has been successfully applied to mixtures of cyclic poly(L-lactide) (PLA) with increasing amounts of its linear topological isomer. Purification in the gas phase of the PLA mixtures was established based on SY curves obtained in MS3 mode: all samples gave rise to the same SY curve, corresponding to the pure cyclic component. This new strategy was sensitive enough to detect traces of linear PLA (<3%) in a sample of cyclic PLA that was supposedly pure according to other characterization techniques (1H NMR, MALDI-HRMS, and size-exclusion chromatography). Moreover, in this case, the presence of the linear isomer was undetectable by MS/MS or MS/MS/MS analysis alone, as the fragment ions are also of the same m/z values. This type of approach could easily be implemented in hyphenated mass spectrometric techniques to improve the structural and quantitative analysis of complex samples.
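
    The survival yield underlying those curves is simply the fraction of precursor ions that survive CID intact at a given collision energy. The sketch below shows only how an SY curve is assembled from precursor and fragment intensities; the intensities and energies are hypothetical, and the paper's MS3 "gas-phase purification" step and comparison against a pure reference isomer are not reproduced.

      import numpy as np

      def survival_yield(precursor_intensity, fragment_intensities):
          """Survival yield: fraction of precursor ions that survive CID intact."""
          return precursor_intensity / (precursor_intensity + sum(fragment_intensities))

      # Hypothetical MS/MS intensities recorded at increasing collision energies (eV).
      energies = np.array([5, 10, 15, 20, 25, 30])
      precursor = np.array([980, 900, 700, 400, 150, 30])
      fragments = np.array([[5, 10], [40, 60], [150, 150], [320, 280], [480, 370], [600, 370]])

      sy_curve = [survival_yield(p, f) for p, f in zip(precursor, fragments)]
      for e, sy in zip(energies, sy_curve):
          print(f"{e:2d} eV: SY = {sy:.2f}")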

  13. A Steady-State Kalman Predictor-Based Filtering Strategy for Non-Overlapping Sub-Band Spectral Estimation

    PubMed Central

    Li, Zenghui; Xu, Bin; Yang, Jian; Song, Jianshe

    2015-01-01

    This paper focuses on suppressing spectral overlap for sub-band spectral estimation, with which we can greatly decrease the computational complexity of existing spectral estimation algorithms, such as nonlinear least squares spectral analysis and non-quadratic regularized sparse representation. Firstly, our study shows that the nominal ability of the high-order analysis filter to suppress spectral overlap is greatly weakened when filtering a finite-length sequence, because many meaningless zeros are used as samples in convolution operations. Next, an extrapolation-based filtering strategy is proposed to produce a series of estimates as the substitutions of the zeros and to recover the suppression ability. Meanwhile, a steady-state Kalman predictor is applied to perform a linearly-optimal extrapolation. Finally, several typical methods for spectral analysis are applied to demonstrate the effectiveness of the proposed strategy. PMID:25609038
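
    The mechanism described above, replacing the implicit zeros beyond a finite record with predicted samples before applying the high-order analysis filter, can be sketched as follows. This is not the paper's steady-state Kalman predictor: a least-squares AR extrapolation stands in for it (an assumption), and the test signal, filter design and record length are all illustrative.

      import numpy as np
      from scipy.signal import firwin, lfilter

      fs, n, delay = 1000.0, 256, 50                    # record length and filter group delay
      t_long = np.arange(n + delay) / fs
      signal_long = np.sin(2 * np.pi * 60 * t_long) + 0.8 * np.sin(2 * np.pi * 240 * t_long)
      x = signal_long[:n]                               # only a finite record is observed

      h = firwin(numtaps=2 * delay + 1, cutoff=150.0, fs=fs)   # high-order analysis low-pass

      def ar_extrapolate(x, order=20, n_extra=delay):
          """Least-squares AR extrapolation, standing in for the steady-state
          Kalman predictor of the paper, to replace the meaningless zeros."""
          rows = np.array([x[i - order:i][::-1] for i in range(order, len(x))])
          coeffs, *_ = np.linalg.lstsq(rows, x[order:], rcond=None)
          out = list(x)
          for _ in range(n_extra):
              out.append(float(np.dot(coeffs, out[-1:-order - 1:-1])))
          return np.asarray(out)

      def subband(record):
          """Group-delay-compensated low-pass sub-band of a (possibly padded) record."""
          return lfilter(h, 1.0, record)[delay:delay + n]

      reference = subband(signal_long)                          # what a longer record would give
      with_zeros = subband(np.concatenate([x, np.zeros(delay)]))
      with_extrapolation = subband(ar_extrapolate(x))

      tail = slice(n - delay, n)                                # region affected by the record end
      print("end-of-record error, zero padding:    ", np.abs(with_zeros[tail] - reference[tail]).max())
      print("end-of-record error, AR extrapolation:", np.abs(with_extrapolation[tail] - reference[tail]).max())

    Near the end of the record, the zero-padded output leaks energy from the out-of-band 240 Hz tone into the low-frequency sub-band, while the extrapolated version stays close to the reference, which is the effect the paper's predictor is designed to achieve.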

  14. Comparative Initial and Sustained Engagement in Web-based Training by Behavioral Healthcare Providers in New York State.

    PubMed

    Talley, Rachel; Chiang, I-Chin; Covell, Nancy H; Dixon, Lisa

    2018-06-01

    Improved dissemination is critical to implementation of evidence-based practice in community behavioral healthcare settings. Web-based training modalities are a promising strategy for dissemination of evidence-based practice in community behavioral health settings. Initial and sustained engagement of these modalities in large, multidisciplinary community provider samples is not well understood. This study evaluates comparative engagement and user preferences by provider type in a web-based training platform in a large, multidisciplinary community sample of behavioral health staff in New York State. Workforce make-up among platform registrants was compared to the general NYS behavioral health workforce. Training completion by functional job type was compared to characterize user engagement and preferences. Frequently completed modules were classified by credit and requirement incentives. High initial training engagement across professional role was demonstrated, with significant differences in initial and sustained engagement by professional role. The most frequently completed modules across functional job types contained credit or requirement incentives. The analysis demonstrated that high engagement of a web-based training in a multidisciplinary provider audience can be achieved without tailoring content to specific professional roles. Overlap between frequently completed modules and incentives suggests a role for incentives in promoting engagement of web-based training. These findings further the understanding of strategies to promote large-scale dissemination of evidence-based practice in community behavioral health settings.

  15. Condom use: exploring verbal and non-verbal communication strategies among Latino and African American men and women.

    PubMed

    Zukoski, Ann P; Harvey, S Marie; Branch, Meredith

    2009-08-01

    A growing body of literature provides evidence of a link between communication with sexual partners and safer sexual practices, including condom use. More research is needed that explores the dynamics of condom communication including gender differences in initiation, and types of communication strategies. The overall objective of this study was to explore condom use and the dynamics surrounding condom communication in two distinct community-based samples of African American and Latino heterosexual couples at increased risk for HIV. Based on 122 in-depth interviews, 80% of women and 74% of men reported ever using a condom with their primary partner. Of those who reported ever using a condom with their current partner, the majority indicated that condom use was initiated jointly by men and women. In addition, about one-third of the participants reported that the female partner took the lead and let her male partner know she wanted to use a condom. A sixth of the sample reported that men initiated use. Although over half of the respondents used bilateral verbal strategies (reminding, asking and persuading) to initiate condom use, one-fourth used unilateral verbal strategies (commanding and threatening to withhold sex). A smaller number reported using non-verbal strategies involving condoms themselves (e.g. putting a condom on or getting condoms). The results suggest that interventions designed to improve condom use may need to include both members of a sexual dyad and focus on improving verbal and non-verbal communication skills of individuals and couples.

  16. Nuclear Magnetic Resonance Spectroscopy-Based Identification of Yeast.

    PubMed

    Himmelreich, Uwe; Sorrell, Tania C; Daniel, Heide-Marie

    2017-01-01

    Rapid and robust high-throughput identification of environmental, industrial, or clinical yeast isolates is important whenever relatively large numbers of samples need to be processed in a cost-efficient way. Nuclear magnetic resonance (NMR) spectroscopy generates complex data based on metabolite profiles, chemical composition and possibly medium consumption, which can be used not only for the assessment of metabolic pathways but also for accurate identification of yeast down to the subspecies level. Initial results on NMR-based yeast identification were comparable with conventional and DNA-based identification. Potential advantages of NMR spectroscopy in mycological laboratories include not only accurate identification but also the potential for automated sample delivery, automated analysis using computer-based methods, rapid turnaround time, high throughput, and low running costs. We describe here the sample preparation, data acquisition and analysis for NMR-based yeast identification. In addition, a roadmap for the development of classification strategies is given that will result in the acquisition of a database and analysis algorithms for yeast identification in different environments.

  17. A PVS Prover Strategy Package for Common Manipulations

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.

    2002-01-01

    Sequent manipulations for an interactive prover such as PVS can often be labor intensive. We describe an approach to tactic-based proving for improved interactive deduction in specialized domains. An experimental package of strategies (tactics) and support functions has been developed for PVS to reduce the tedium of arithmetic manipulation. Included are strategies aimed at algebraic simplification of real-valued expressions as well as term-access techniques applicable in arbitrary settings. The approach is general enough to serve in other mathematical domains and for provers other than PVS. This report presents the full set of arithmetic strategies and discusses how they are invoked within the prover. Included is a description of the extended expression notation for accessing terms as well as a substitution technique provided for higher-order strategies. Several sample proofs are displayed in full to show how the strategies might be used in practice.

  18. Population pharmacokinetic analysis of clopidogrel in healthy Jordanian subjects with emphasis on optimal sampling strategy.

    PubMed

    Yousef, A M; Melhem, M; Xue, B; Arafat, T; Reynolds, D K; Van Wart, S A

    2013-05-01

    Clopidogrel is metabolized primarily into an inactive carboxyl metabolite (clopidogrel-IM) or to a lesser extent an active thiol metabolite. A population pharmacokinetic (PK) model was developed using NONMEM(®) to describe the time course of clopidogrel-IM in plasma and to design a sparse-sampling strategy to predict clopidogrel-IM exposures for use in characterizing anti-platelet activity. Serial blood samples from 76 healthy Jordanian subjects administered a single 75 mg oral dose of clopidogrel were collected and assayed for clopidogrel-IM using reverse-phase high-performance liquid chromatography. A two-compartment (2-CMT) PK model with first-order absorption and elimination plus an absorption lag time was evaluated, as well as a variation of this model designed to mimic enterohepatic recycling (EHC). Optimal PK sampling strategies (OSS) were determined using WinPOPT based upon collection of 3-12 post-dose samples. A two-compartment model with EHC provided the best fit and reduced bias in C(max) (median prediction error (PE%) of 9.58% versus 12.2%) relative to the basic two-compartment model; AUC(0-24) was similar for both models (median PE% = 1.39%). The OSS for fitting the two-compartment model with EHC required the collection of seven samples (0.25, 1, 2, 4, 5, 6 and 12 h). Reasonably unbiased and precise exposures were obtained when re-fitting this model to a reduced dataset considering only these sampling times. A two-compartment model considering EHC best characterized the time course of clopidogrel-IM in plasma. Use of the suggested OSS will allow for the collection of fewer PK samples when assessing clopidogrel-IM exposures. Copyright © 2013 John Wiley & Sons, Ltd.
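
    As a rough illustration of the model class described above, the sketch below simulates a two-compartment oral-absorption PK model with an absorption lag time and reads off C(max) on a dense sampling grid. All parameter values are hypothetical placeholders, not the estimates reported in this study, and the enterohepatic-recycling variant is not reproduced.

```python
# Minimal sketch: two-compartment PK model with first-order absorption and an
# absorption lag time. Parameters are illustrative, NOT the study's estimates.
import numpy as np
from scipy.integrate import odeint

def two_cmt_oral(y, t, ka, CL, Vc, Q, Vp):
    """Amounts in gut, central, and peripheral compartments."""
    a_gut, a_cen, a_per = y
    d_gut = -ka * a_gut
    d_cen = ka * a_gut - (CL / Vc) * a_cen - (Q / Vc) * a_cen + (Q / Vp) * a_per
    d_per = (Q / Vc) * a_cen - (Q / Vp) * a_per
    return [d_gut, d_cen, d_per]

dose, tlag = 75.0, 0.5                               # mg, h (hypothetical lag)
ka, CL, Vc, Q, Vp = 1.5, 10.0, 30.0, 5.0, 50.0       # 1/h, L/h, L, L/h, L (made up)
t = np.linspace(0.0, 24.0, 481)                      # dense grid over 24 h

sol = odeint(two_cmt_oral, [dose, 0.0, 0.0], t, args=(ka, CL, Vc, Q, Vp))
conc_no_lag = sol[:, 1] / Vc                         # central concentration, mg/L
conc = np.interp(t - tlag, t, conc_no_lag, left=0.0) # shift output by the lag time
print(f"Cmax ~= {conc.max():.3f} mg/L at t = {t[conc.argmax()]:.2f} h")
```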

  19. River runoff estimates based on remotely sensed surface velocities

    NASA Astrophysics Data System (ADS)

    Grünler, Steffen; Stammer, Detlef; Romeiser, Roland

    2010-05-01

    One promising technique for river runoff estimation from space is the retrieval of surface currents on the basis of synthetic aperture radar along-track interferometry (ATI). The German satellite TerraSAR-X, which was launched in June 2007, permits ATI measurements in an experimental mode. Based on numerical simulations, we present findings of a research project in which the potential of satellite measurements of various parameters with different temporal and spatial sampling characteristics is evaluated. A sampling strategy for river runoff estimates is developed. We address the achievable accuracy and limitations of such estimates for different local flow conditions at a selected test site. High-resolution three-dimensional current fields in the Elbe river (Germany) from a numerical model are used as a reference data set and as input for simulations of a variety of possible measuring and data interpretation strategies. To address the problem of aliasing, we removed tidal signals from the sampled data. Discharge estimates on the basis of measured surface current fields and river widths from TerraSAR-X are successfully simulated. The resulting net discharge estimates differ by 30-55% for a required continuous observation period of one year. We discuss the applicability of the measuring strategies to a number of major rivers. Furthermore, we show runoff estimates obtained by retrieving surface current fields from real TerraSAR-X ATI data (AS mode) for the Elbe river study area.

  20. fMRI capture of auditory hallucinations: Validation of the two-steps method.

    PubMed

    Leroy, Arnaud; Foucher, Jack R; Pins, Delphine; Delmaire, Christine; Thomas, Pierre; Roser, Mathilde M; Lefebvre, Stéphanie; Amad, Ali; Fovet, Thomas; Jaafari, Nemat; Jardri, Renaud

    2017-10-01

    Our purpose was to validate a reliable method to capture brain activity concomitant with hallucinatory events, which constitute frequent and disabling experiences in schizophrenia. Capturing hallucinations using functional magnetic resonance imaging (fMRI) remains very challenging. We previously developed a method based on a two-steps strategy including (1) multivariate data-driven analysis of per-hallucinatory fMRI recording and (2) selection of the components of interest based on a post-fMRI interview. However, two tests still need to be conducted to rule out critical pitfalls of conventional fMRI capture methods before this two-steps strategy can be adopted in hallucination research: replication of these findings on an independent sample and assessment of the reliability of the hallucination-related patterns at the subject level. To do so, we recruited a sample of 45 schizophrenia patients suffering from frequent hallucinations, 20 schizophrenia patients without hallucinations and 20 matched healthy volunteers; all participants underwent four different experiments. The main findings are (1) high accuracy in reporting unexpected sensory stimuli in an MRI setting; (2) good detection concordance between hypothesis-driven and data-driven analysis methods (as used in the two-steps strategy) when controlled unexpected sensory stimuli are presented; (3) good agreement of the two-steps method with the online button-press approach to capture hallucinatory events; (4) high spatial consistency of hallucinatory-related networks detected using the two-steps method on two independent samples. By validating the two-steps method, we advance toward the possible transfer of such technology to new image-based therapies for hallucinations. Hum Brain Mapp 38:4966-4979, 2017. © 2017 Wiley Periodicals, Inc.

  1. Educational and evaluation strategies in the training of physician specialists

    PubMed

    Gaona-Flores, Verónica Alejandra; Campos-Navarro, Luz Arcelia; Arenas-Osuna, Jesús; Alcalá-Martínez, Enrique

    2017-01-01

    Teaching strategies have been defined as procedures, means, or resources that teachers use to promote meaningful learning. The objective was to identify the teaching and evaluation strategies used by professors with residents in tertiary care hospitals. This is a cross-sectional study conducted with full, associate, and assistant professors of various medical specialties. A questionnaire was applied to evaluate the strategies used by professors to teach and evaluate students. We included a sample of 90 professors in 35 medical specialties. The most frequent teaching activities were organizing students to develop presentations on specific subjects, followed by asking questions on previously reviewed subjects. In terms of the strategies employed, the one most frequently rated as "always" used was case analysis. The most frequent methods used for the evaluation of theoretical knowledge were participation in class, topic presentation, and exams. Teaching activities were primarily based on the presentation of specific topics by the residents. The most commonly used educational strategies were clinical case analyses, followed by problem-based learning and the use of illustrations. Evaluation of the residents' theoretical knowledge hinged on class participation, presentation of assigned topics, and exams. Copyright: © 2017 Secretaría de Salud

  2. Fault detection, isolation, and diagnosis of self-validating multifunctional sensors.

    PubMed

    Yang, Jing-Li; Chen, Yin-Sheng; Zhang, Li-Li; Sun, Zhen

    2016-06-01

    A novel fault detection, isolation, and diagnosis (FDID) strategy for self-validating multifunctional sensors is presented in this paper. The sparse non-negative matrix factorization-based method can effectively detect faults by using the squared prediction error (SPE) statistic, and variable contribution plots based on the SPE statistic help to locate and isolate the faulty sensitive units. Complete ensemble empirical mode decomposition is employed to decompose the fault signals into a series of intrinsic mode functions (IMFs) and a residual. The sample entropy (SampEn)-weighted energy values of each IMF and the residual are estimated to represent the characteristics of the fault signals. A multi-class support vector machine is introduced to identify the fault mode, with the purpose of diagnosing the status of the faulty sensitive units. The performance of the proposed strategy is compared with other fault detection strategies, such as principal component analysis and independent component analysis, and with fault diagnosis strategies such as empirical mode decomposition coupled with a support vector machine. The proposed strategy is fully evaluated in a real self-validating multifunctional sensor experimental system, and the experimental results demonstrate that the proposed strategy provides an excellent solution to the FDID research topic for self-validating multifunctional sensors.
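
    As a minimal illustration of the SPE-based detection step described above, the sketch below trains a low-dimensional model on fault-free data and flags test samples whose reconstruction error exceeds an empirical control limit. Ordinary PCA (one of the comparison methods named in the abstract) stands in for the paper's sparse non-negative matrix factorization, and all data are synthetic.

```python
# SPE (squared prediction error) fault detection sketch using PCA as a stand-in
# for the sparse NMF model; the contribution-plot and diagnosis stages are omitted.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X_normal = rng.normal(size=(200, 8))                 # training data, healthy sensors
X_test = rng.normal(size=(50, 8))
X_test[25:] += rng.normal(3.0, 1.0, size=(25, 8))    # simulated faulty readings

pca = PCA(n_components=3).fit(X_normal)

def spe(model, X):
    """Squared reconstruction error of each row of X."""
    X_hat = model.inverse_transform(model.transform(X))
    return np.sum((X - X_hat) ** 2, axis=1)

# Control limit set empirically as the 99th percentile of the training SPE values.
limit = np.percentile(spe(pca, X_normal), 99)
faulty = spe(pca, X_test) > limit
print(f"{faulty.sum()} of {len(X_test)} test samples flagged as faulty")
```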

  3. Monte Carlo-based interval transformation analysis for multi-criteria decision analysis of groundwater management strategies under uncertain naphthalene concentrations and health risks

    NASA Astrophysics Data System (ADS)

    Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong

    2016-08-01

    A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration, and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) reducing computational time through the introduction of a Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered for each planning duration on the basis of four criteria. Results indicated that the most desirable management strategy was action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management horizon.
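
    The sketch below illustrates the general idea of Monte Carlo sampling over interval-valued criteria for a weighted-sum ranking of management alternatives. It is a generic stand-in, not the MCITA procedure itself: the alternatives, intervals, and weights are hypothetical, and the actual interval transformation steps are not reproduced.

```python
# Monte Carlo ranking of alternatives whose criteria are given as intervals.
import numpy as np

rng = np.random.default_rng(42)
# Each alternative has [low, high] intervals for: cost, concentration, risk, time.
intervals = {
    "action_2":  [(1.0, 1.4), (0.2, 0.5), (0.1, 0.3), (0.6, 0.9)],
    "action_8":  [(0.8, 1.2), (0.3, 0.6), (0.2, 0.4), (0.5, 0.8)],
    "action_15": [(0.9, 1.1), (0.1, 0.4), (0.1, 0.2), (0.7, 1.0)],
}
weights = np.array([0.3, 0.3, 0.3, 0.1])   # all criteria treated as "lower is better"

n_draws, names = 10_000, list(intervals)
wins = dict.fromkeys(names, 0)
for _ in range(n_draws):
    # Draw one realization of each criterion uniformly from its interval.
    scores = {a: weights @ np.array([rng.uniform(lo, hi) for lo, hi in intervals[a]])
              for a in names}
    wins[min(scores, key=scores.get)] += 1     # lowest weighted score wins this draw

for a in names:
    print(f"{a}: preferred in {100 * wins[a] / n_draws:.1f}% of draws")
```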

  4. Multiplexed MRM-Based Protein Quantitation Using Two Different Stable Isotope-Labeled Peptide Isotopologues for Calibration.

    PubMed

    LeBlanc, André; Michaud, Sarah A; Percy, Andrew J; Hardie, Darryl B; Yang, Juncong; Sinclair, Nicholas J; Proudfoot, Jillaine I; Pistawka, Adam; Smith, Derek S; Borchers, Christoph H

    2017-07-07

    When quantifying endogenous plasma proteins for fundamental and biomedical research - as well as for clinical applications - precise, reproducible, and robust assays are required. Targeted detection of peptides in a bottom-up strategy is the most common and precise mass spectrometry-based quantitation approach when combined with the use of stable isotope-labeled peptides. However, when measuring protein in plasma, the unknown endogenous levels prevent the implementation of the best calibration strategies, since no blank matrix is available. Consequently, several alternative calibration strategies are employed by different laboratories. In this study, these methods were compared to a new approach using two different stable isotope-labeled standard (SIS) peptide isotopologues for each endogenous peptide to be quantified, enabling an external calibration curve as well as the quality control samples to be prepared in pooled human plasma without interference from endogenous peptides. This strategy improves the analytical performance of the assay and enables the accuracy of the assay to be monitored, which can also facilitate method development and validation.
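
    The calibration idea described above can be sketched as a simple regression of the measured peptide ratio against known spiked concentrations, which is then inverted for an unknown sample. The numbers below are invented for illustration; in practice weighted regression and replicate quality control samples would typically be used.

```python
# External calibration curve from light/heavy peak-area ratios (synthetic data).
import numpy as np

cal_conc = np.array([5, 10, 25, 50, 100, 250], dtype=float)        # fmol on column
cal_ratio = np.array([0.049, 0.101, 0.244, 0.507, 0.98, 2.51])      # measured ratios

slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)                # simple linear fit
unknown_ratio = 0.62
estimated_conc = (unknown_ratio - intercept) / slope                 # invert the curve
print(f"ratio = {slope:.4f} * conc + {intercept:.4f}; unknown ~= {estimated_conc:.1f} fmol")
```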

  5. "We're Changing Our Ways": Women's Coping Strategies for Obesity Risk-reducing Behaviors in American Indian Households.

    PubMed

    Gadhoke, Preety; Christiansen, Karina; Pardilla, Marla; Frick, Kevin; Gittelsohn, Joel

    2015-01-01

    This article reveals women caregivers' perceptions and coping strategies to improve households' food and physical activity habits. Results emerged from the pre-intervention formative research phase of a multi-site, multi-level obesity prevention pilot intervention on American Indian (AI) reservations. Using purposive sampling, 250 adults and children participated in qualitative research. Results reveal that having local institutional support was a key structural facilitator. 'Family connectedness' emerged as a key relational facilitator. Hegemony of systems, food deserts, transportation, and weather were key structural barriers; childcare needs and time constraints were key relational barriers. Women's coping strategies included planning ahead, maximizing, apportioning, tempting healthy, and social support. Findings informed the development and implementation of a novel obesity prevention pilot intervention tailored for each participating AI community, addressing culturally relevant messages, institutional policies, and programs. We conclude with future considerations for comparative, ethnicity-based, class-based, and gender-specific studies on women's coping strategies for household health behaviors.

  6. Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples

    PubMed Central

    2012-01-01

    Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466

  7. A methodological approach to improve the sexual health of vulnerable female populations: incentivized peer-recruitment and field-based STD testing.

    PubMed

    Roth, Alexis M; Rosenberger, Joshua G; Reece, Michael; Van Der Pol, Barbara

    2012-02-01

    Transactional sex has been associated with increased risk of adverse health outcomes, including sexually transmitted infections (STIs). Participants included female sex workers and men they recruited utilizing incentivized snowball sampling. Participants provided specimens for STI diagnostic testing and completed a semi-structured interview. Forty-four participants aged 19-65 were interviewed. Participants found self-sampling to be acceptable and overwhelmingly endorsed sampling outside of a clinic (90%) for reasons such as convenience, privacy, and lack of stigma. A substantial minority (38%) tested positive for at least one STI. Novel strategies may encourage sexual health care and prevent STIs among sex workers. High infection and screening acceptance rates across the sample suggest that individuals engaged in transactional sex would benefit from, and would be responsive to, community-based self-sampling for STI screening.

  8. Research on regularized mean-variance portfolio selection strategy with modified Roy safety-first principle.

    PubMed

    Atta Mills, Ebenezer Fiifi Emire; Yan, Dawen; Yu, Bo; Wei, Xinyuan

    2016-01-01

    We propose a consolidated risk measure based on variance and the safety-first principle in a mean-risk portfolio optimization framework. The safety-first principle for financial portfolio selection is modified and improved. Our proposed models are subjected to norm regularization to seek near-optimal, stable, and sparse portfolios. We compare the cumulative wealth of our preferred proposed model to a benchmark, the S&P 500 index, over the same period. Our proposed portfolio strategies have better out-of-sample performance than the selected alternative portfolio rules in the literature and control the downside risk of the portfolio returns.
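
    A minimal sketch of the regularized mean-variance idea is shown below, assuming a long-only portfolio and using a ridge (L2) penalty as a plain stand-in for the norm regularization described in the abstract; the safety-first modification is not reproduced. Returns, weights, and penalty values are synthetic and illustrative.

```python
# Long-only mean-variance optimization with an L2 penalty (illustrative stand-in).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
returns = rng.normal(0.0005, 0.01, size=(500, 6))   # 500 days, 6 hypothetical assets
mu, sigma = returns.mean(axis=0), np.cov(returns, rowvar=False)
risk_aversion, lam = 5.0, 0.1                       # trade-off and regularization weights

def objective(w):
    # variance-return trade-off plus a penalty that discourages concentrated weights
    return risk_aversion * w @ sigma @ w - mu @ w + lam * np.sum(w ** 2)

n = len(mu)
res = minimize(objective, x0=np.full(n, 1.0 / n),
               bounds=[(0.0, 1.0)] * n,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
print("portfolio weights:", np.round(res.x, 3))
```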

  9. Uncertainty in the profitability of fertilizer management based on various sampling designs.

    NASA Astrophysics Data System (ADS)

    Muhammed, Shibu; Ben, Marchant; Webster, Richard; Milne, Alice; Dailey, Gordon; Whitmore, Andrew

    2016-04-01

    Many farmers sample their soil to measure the concentrations of plant nutrients, including phosphorus (P), so as to decide how much fertilizer to apply. Now that fertilizer can be applied at variable rates, farmers want to know whether maps of nutrient concentration made from grid samples or from field subdivisions (zones within their fields) are merited: do such maps lead to greater profit than would a single measurement on a bulked sample for each field when all costs are taken into account? We have examined the merits of grid-based and zone-based sampling strategies over single field-based averages using continuous spatial data on wheat yields at harvest in six fields in southern England and simulated concentrations of P in the soil. Features of the spatial variation in the yields provide predictions about which sampling scheme is likely to be most cost-effective, but there is uncertainty associated with these predictions that must be communicated to farmers. Where variograms of the yield have large variances and long effective ranges, grid sampling and mapping nutrients are likely to be cost-effective. Where effective ranges are short, sampling must be dense to reveal the spatial variation and may be expensive. In these circumstances variable-rate application of fertilizer is likely to be impracticable and almost certainly not cost-effective. We have explored several methods for communicating these results and found that the most effective method was using probability maps that show the likelihood of grid-based and zone-based sampling being more profitable than a field-based estimate.

  10. The cost-effectiveness of cervical self-sampling to improve routine cervical cancer screening: The importance of respondent screening history and compliance

    PubMed Central

    Burger, Emily A; Sy, Stephen; Nygård, Mari; Kim, Jane J

    2016-01-01

    Background Human papillomavirus (HPV) testing allows women to self-collect cervico-vaginal cells at home (i.e., self-sampling). Using primary data from a randomized pilot study, we evaluated the long-term consequences and cost-effectiveness of using self-sampling to improve participation in routine cervical cancer screening in Norway. Methods We compared a strategy reflecting screening participation (using reminder letters) to strategies that involved mailing self-sampling device kits to women non-compliant to screening within a 5-year or 10-year period under two scenarios: A) self-sampling respondents had moderate under-screening histories, or B) respondents to self-sampling had moderate and severe under-screening histories. Model outcomes included quality-adjusted life-years (QALY) and lifetime costs. The ‘most cost-effective’ strategy was identified as the strategy with an incremental cost-effectiveness ratio just below $100,000 per QALY gained. Results Mailing self-sampling device kits to all women non-compliant to screening within a 5-year or 10-year period can be more effective and less costly than the current reminder letter policy; however, the optimal self-sampling strategy was dependent on the profile of self-sampling respondents. For example, ‘10-yearly self-sampling’ is preferred ($95,500 per QALY gained) if ‘5-yearly self-sampling’ could only attract moderate under-screeners; however, ‘5-yearly self-sampling’ is preferred if this strategy could additionally attract severe under-screeners. Conclusions Targeted self-sampling of non-compliers likely represents good value-for-money; however, the preferred strategy is contingent on the screening histories and compliance of respondents. Impact The magnitude of the health benefit and optimal self-sampling strategy is dependent on the profile and behavior of respondents. Health authorities should understand these factors prior to selecting and implementing a self-sampling policy. PMID:27624639
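
    As a small illustration of the decision rule implied by the $100,000-per-QALY threshold, the sketch below computes an incremental cost-effectiveness ratio (ICER) for a hypothetical strategy against a comparator. The costs and QALYs are invented and do not come from the study.

```python
# ICER sketch: incremental cost per incremental QALY, compared to a threshold.
def icer(cost_new, qaly_new, cost_ref, qaly_ref):
    return (cost_new - cost_ref) / (qaly_new - qaly_ref)

threshold = 100_000  # willingness-to-pay, USD per QALY gained
value = icer(cost_new=1_250_000, qaly_new=215.0, cost_ref=1_050_000, qaly_ref=212.9)
print(f"ICER = ${value:,.0f} per QALY -> {'adopt' if value < threshold else 'reject'}")
```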

  11. Education strategies to foster health professional students' clinical reasoning skills.

    PubMed

    Rochmawati, Erna; Wiechula, Rick

    2010-06-01

    Clinical reasoning is an important skill for health professionals that should be developed to achieve high levels of expertise. Several education strategies have been suggested for implementation by health professional educators to foster their students' clinical reasoning skills. These strategies include problem-based learning, the integrative curriculum, reflection, and concept mapping. This review assesses which is the most effective education strategy for developing the clinical reasoning skills of health professional students. Four publications, from a total of 692 identified records, were included. Overall, this review was unable to reach a definitive conclusion. Therefore, there is a need to conduct more studies with larger samples and to undertake research that evaluates additional education interventions, variations in the delivery of education interventions, and the cost-effectiveness of implementing education strategies.

  12. Jaccard distance based weighted sparse representation for coarse-to-fine plant species recognition.

    PubMed

    Zhang, Shanwen; Wu, Xiaowei; You, Zhuhong

    2017-01-01

    Leaf-based plant species recognition plays an important role in ecological protection; however, its application to large, modern leaf databases has been a long-standing obstacle due to computational cost and feasibility. Recognizing such limitations, we propose a Jaccard distance based sparse representation (JDSR) method which adopts a two-stage, coarse-to-fine strategy for plant species recognition. In the first stage, we use the Jaccard distance between the test sample and each training sample to coarsely determine the candidate classes of the test sample. The second stage includes a Jaccard distance based weighted sparse representation-based classification (WSRC), which aims to approximately represent the test sample in the training space and classify it by the approximation residuals. Since the training model of our JDSR method involves far fewer but more informative representatives, this method is expected to overcome the limitation of high computational and memory costs in traditional sparse representation based classification. Comparative experimental results on a public leaf image database demonstrate that the proposed method outperforms other existing feature extraction and SRC-based plant recognition methods in terms of both accuracy and computational speed.
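
    The coarse screening stage described above can be sketched as follows: Jaccard distances between a binarized test feature vector and all training samples are used to shortlist candidate species before the (omitted) weighted sparse representation stage. The features and class structure below are synthetic.

```python
# Coarse stage of a coarse-to-fine classifier: Jaccard-distance shortlisting.
import numpy as np

def jaccard_distance(a, b):
    """1 - |intersection| / |union| for binary vectors a and b."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return 1.0 - inter / union if union else 0.0

rng = np.random.default_rng(7)
X_train = rng.random((120, 64)) > 0.6          # binarized leaf features (synthetic)
y_train = np.repeat(np.arange(12), 10)         # 12 species, 10 samples each
x_test = rng.random(64) > 0.6

dists = np.array([jaccard_distance(x_test, x) for x in X_train])
# Keep the classes of the k nearest training samples as candidates for the fine stage.
k = 15
candidate_classes = np.unique(y_train[np.argsort(dists)[:k]])
print("candidate classes passed to the fine (WSRC) stage:", candidate_classes)
```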

  13. Optimization of black-box models with uncertain climatic inputs—Application to sunflower ideotype design

    PubMed Central

    Picheny, Victor; Trépos, Ronan; Casadebaig, Pierre

    2017-01-01

    Accounting for interannual climatic variations is a well-known issue for simulation-based studies of environmental systems. It often requires intensive sampling (e.g., averaging the simulation outputs over many climatic series), which hinders many sequential processes, in particular optimization algorithms. We propose here an approach based on subset selection from a large basis of climatic series, using an ad-hoc similarity function and clustering. A non-parametric reconstruction technique is introduced to accurately estimate the distribution of the output of interest using only the subset sampling. The proposed strategy is non-intrusive and generic (i.e., transposable to most models with climatic data inputs), and can be combined with most “off-the-shelf” optimization solvers. We apply our approach to sunflower ideotype design using the crop model SUNFLO. The underlying optimization problem is formulated as a multi-objective one to account for risk aversion. Our approach achieves good performance even for limited computational budgets, significantly outperforming standard strategies. PMID:28542198
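
    The subset-selection idea can be sketched as clustering the climatic series and keeping one representative per cluster. The similarity features used below (a few summary statistics per series) are an assumption for illustration; the paper's ad-hoc similarity function and the non-parametric reconstruction step are not reproduced.

```python
# Pick a representative subset of climatic series by clustering simple features.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
series = rng.normal(size=(200, 365))                      # 200 synthetic daily series

# Describe each series with a few summary features (an illustrative choice).
features = np.column_stack([series.mean(axis=1),
                            series.std(axis=1),
                            series.min(axis=1),
                            series.max(axis=1)])

k = 8                                                      # size of the subset
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(features)

# From each cluster, keep the series closest to the cluster centroid.
subset = [int(np.argmin(np.linalg.norm(features - c, axis=1)))
          for c in km.cluster_centers_]
print("indices of representative climatic series:", sorted(subset))
```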

  14. Recommended Mass Spectrometry-Based Strategies to Identify Ricin-Containing Samples.

    PubMed

    Kalb, Suzanne R; Schieltz, David M; Becher, François; Astot, Crister; Fredriksson, Sten-Åke; Barr, John R

    2015-11-25

    Ricin is a protein toxin produced by the castor bean plant (Ricinus communis) together with a related protein known as R. communis agglutinin (RCA120). Mass spectrometric (MS) assays have the capacity to unambiguously identify ricin and to detect ricin's activity in samples with complex matrices. These qualitative and quantitative assays enable detection and differentiation of ricin from the less toxic RCA120 through determination of the amino acid sequence of the protein in question, and active ricin can be monitored by MS as the release of adenine from the depurination of a nucleic acid substrate. In this work, we describe the application of MS-based methods to detect, differentiate and quantify ricin and RCA120 in nine blinded samples supplied as part of the EQuATox proficiency test. Overall, MS-based assays successfully identified all samples containing ricin or RCA120 with the exception of the sample spiked with the lowest concentration (0.414 ng/mL). In fact, mass spectrometry was the most successful method for differentiation of ricin and RCA120 based on amino acid determination. Mass spectrometric methods were also successful at ranking the functional activities of the samples, successfully yielding semi-quantitative results. These results indicate that MS-based assays are excellent techniques to detect, differentiate, and quantify ricin and RCA120 in complex matrices.

  15. Efficient identification of context dependent subgroups of risk from genome wide association studies

    PubMed Central

    Dyson, Greg; Sing, Charles F.

    2014-01-01

    We have developed a modified Patient Rule-Induction Method (PRIM) as an alternative strategy for analyzing representative samples of non-experimental human data to estimate and test the role of genomic variations as predictors of disease risk in etiologically heterogeneous sub-samples. A computational limit of the proposed strategy is encountered when the number of genomic variations (predictor variables) under study is large (> 500), because permutations are used to generate a null distribution to test the significance of a term (defined by values of particular variables) that characterizes a sub-sample of individuals through the peeling and pasting processes. As an alternative, in this paper we introduce a theoretical strategy that enables quick calculation of Type I and Type II errors when evaluating terms in the peeling and pasting processes of a PRIM analysis; with a permutation-based hypothesis test, the Type I error is underestimated and the Type II error is not available at all. The resultant savings in computational time make it possible to consider larger numbers of genomic variations (an example genome-wide association study is given) when selecting statistically significant terms in the formulation of PRIM prediction models. PMID:24570412

  16. Application of an Optimal Search Strategy for the DNAPL Source Identification to a Field Site in Nanjing, China

    NASA Astrophysics Data System (ADS)

    Longting, M.; Ye, S.; Wu, J.

    2014-12-01

    Identifying and removing DNAPL sources in an aquifer system is vital to successful remediation and to lowering remediation time and cost. Our work applies an optimal search strategy introduced by Dokou and Pinder [1], with some modifications, to a field site in Nanjing City, China, to determine the strength and location of DNAPL sources using the fewest samples. The overall strategy uses Monte Carlo stochastic groundwater flow and transport modeling, incorporates existing sampling data into the search strategy, and determines optimal sampling locations, which are selected according to the reduction in overall uncertainty of the field and the proximity to the source locations. After a sample is taken, the plume is updated using a Kalman filter. The updated plume is then compared to the concentration fields that emanate from each individual potential source using a fuzzy set technique. This comparison provides weights that reflect the degree of truth regarding the location of the source. The above steps are repeated until the optimal source characteristics are determined. For our site, some specific modifications and additional work have been carried out as follows. Random hydraulic conductivity (K) fields are generated after fitting the measured K data to a variogram model. The locations of potential sources, which are given initial weights, are targeted based on the field survey, with multiple potential source locations around the workshops and the wastewater basin. Considering the short history (1999-2010) of manufacturing the optical brightener PF at the site, and the existing sampling data, a preliminary source strength is then estimated, which will later be optimized by the simplex method or a genetic algorithm (GA). The whole algorithm then guides optimal sampling and updating as the investigation proceeds, until the weights finally stabilize. Reference: [1] Dokou, Z., and Pinder, G. F. "Optimal search strategy for the definition of a DNAPL source." Journal of Hydrology 376.3 (2009): 542-556. Acknowledgement: Funding support from the National Natural Science Foundation of China (No. 41030746, 40872155) and the DuPont Company is appreciated.
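
    The Kalman filter update used to fold each new concentration sample into the plume estimate can be sketched generically as below. The state layout, observation operator, and noise levels are placeholders; the site-specific Monte Carlo flow/transport ensemble and the fuzzy source-weighting step are not reproduced.

```python
# Generic Kalman measurement update for a discretized concentration field.
import numpy as np

def kalman_update(x, P, z, H, R):
    """x: plume estimate, P: its covariance, z: new sample(s),
    H: observation operator, R: measurement noise covariance."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

n_cells = 50                                  # coarse 1-D plume discretization
x = np.zeros(n_cells)                         # prior mean concentration field
P = np.eye(n_cells) * 4.0                     # prior uncertainty (e.g., from Monte Carlo runs)
H = np.zeros((1, n_cells)); H[0, 20] = 1.0    # a single sample taken at cell 20
R = np.array([[0.25]])                        # measurement noise variance
z = np.array([3.1])                           # measured concentration

x, P = kalman_update(x, P, z, H, R)
print(f"updated estimate at sampled cell: {x[20]:.2f}, variance: {P[20, 20]:.2f}")
```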

  17. Quality of Artemisinin-Based Combination Formulations for Malaria Treatment: Prevalence and Risk Factors for Poor Quality Medicines in Public Facilities and Private Sector Drug Outlets in Enugu, Nigeria

    PubMed Central

    Kaur, Harparkash; Allan, Elizabeth Louise; Mamadu, Ibrahim; Hall, Zoe; Ibe, Ogochukwu; El Sherbiny, Mohamed; van Wyk, Albert; Yeung, Shunmay; Swamidoss, Isabel; Green, Michael D.; Dwivedi, Prabha; Culzoni, Maria Julia; Clarke, Siân; Schellenberg, David; Fernández, Facundo M.; Onwujekwe, Obinna

    2015-01-01

    Background Artemisinin-based combination therapies are recommended by the World Health Organisation (WHO) as first-line treatment for Plasmodium falciparum malaria, yet medication must be of good quality for efficacious treatment. A recent meta-analysis reported that 35% (796/2,296) of antimalarial drug samples from 21 Sub-Saharan African countries, purchased from outlets predominantly using convenience sampling, failed chemical content analysis. We used three sampling strategies to purchase artemisinin-containing antimalarials (ACAs) in Enugu metropolis, Nigeria, and compared the resulting quality estimates. Methods ACAs were purchased using three sampling approaches - convenience, mystery clients and overt - within a defined area and sampling frame in Enugu metropolis. The active pharmaceutical ingredient (API) content was assessed using high-performance liquid chromatography and confirmed by mass spectrometry at three independent laboratories. Results were expressed as a percentage of the API content stated on the packaging and used to categorise each sample as acceptable quality, substandard, degraded, or falsified. Results Content analysis of 3024 samples purchased from 421 outlets using convenience (n=200), mystery (n=1,919) and overt (n=905) approaches showed that overall 90.8% of ACAs were of acceptable quality, 6.8% substandard, 1.3% degraded and 1.2% falsified. Convenience sampling yielded a significantly higher prevalence of poor-quality ACAs than the mystery and overt sampling strategies, which yielded results comparable to each other. Artesunate (n=135; 4 falsified) and dihydroartemisinin (n=14) monotherapy tablets, not recommended by WHO, were also identified. Conclusion Randomised sampling identified fewer falsified ACAs than previously reported by convenience approaches. Our findings emphasise the need for specific consideration to be given to the sampling frame and sampling approach if representative information on drug quality is to be obtained. PMID:26018221

  18. Strategies to Improve Activity Recognition Based on Skeletal Tracking: Applying Restrictions Regarding Body Parts and Similarity Boundaries †

    PubMed Central

    Gutiérrez-López-Franca, Carlos; Hervás, Ramón; Johnson, Esperanza

    2018-01-01

    This paper aims to improve activity recognition systems based on skeletal tracking through the study of two different strategies (and their combination): (a) specialized body-part analysis and (b) stricter restrictions for the most easily detectable activities. The study was performed using the Extended Body-Angles Algorithm, which is able to analyze activities using only a single key sample. This system allows selecting, for each activity considered, its relevant joints, which makes it possible to monitor the user's body using only a subset of the joints. This feature of the system has both advantages and disadvantages; as a consequence, we previously had difficulties recognizing activities for which only a small subset of the body's joints is relevant. The goal of this work, therefore, is to analyze the effect of applying several strategies to the results of an activity recognition system based on skeletal-tracking, joint-oriented devices, strategies applied with the purpose of improving the recognition rates of activities with a small subset of relevant joints. Through the results of this work, we aim to give the scientific community some first indications about which of the considered strategies is better. PMID:29789478

  19. Combining Metabolic ¹⁵N Labeling with Improved Tandem MOAC for Enhanced Probing of the Phosphoproteome.

    PubMed

    Thomas, Martin; Huck, Nicola; Hoehenwarter, Wolfgang; Conrath, Uwe; Beckers, Gerold J M

    2015-01-01

    In eukaryotic cells many diverse cellular functions are regulated by reversible protein phosphorylation. In recent years, phosphoproteomics has become a powerful tool for studying protein phosphorylation because it enables unbiased localization and site-specific quantification of in vivo phosphorylation of hundreds of proteins in a single experiment. A common strategy for identifying phosphoproteins and their phosphorylation sites from complex biological samples is the enrichment of phosphopeptides from digested cellular lysates followed by mass spectrometry. However, despite the high sensitivity of modern mass spectrometers, the large dynamic range of protein abundance and the transient nature of protein phosphorylation remain major pitfalls in MS-based phosphoproteomics. This is particularly true for plants, in which the presence of secondary metabolites and endogenous compounds, the overabundance of ribulose-1,5-bisphosphate carboxylase and other components of the photosynthetic apparatus, and the concurrent difficulties in protein extraction necessitate two-step phosphoprotein/phosphopeptide enrichment strategies (Nakagami et al., Plant Cell Physiol 53:118-124, 2012). Approaches for label-free peptide quantification are advantageous due to their low cost and experimental simplicity, but they lack precision. These drawbacks can be overcome by metabolic labeling of whole plants with heavy nitrogen ((15)N), which allows combining two samples very early in the phosphoprotein enrichment workflow. This avoids sample-to-sample variation introduced by the analytical procedures and results in robust relative quantification values that need no further standardization. The integration of (15)N metabolic labeling into tandem metal-oxide affinity chromatography (MOAC) (Hoehenwarter et al., Mol Cell Proteomics 12:369-380, 2013) presents an improved and highly selective approach for the identification and accurate site-specific quantification of low-abundance phosphoproteins that is based on the successive enrichment of light and heavy nitrogen-labeled phosphoproteins and peptides. This improved strategy combines metabolic labeling of whole plants with the stable heavy nitrogen isotope ((15)N), protein extraction under denaturing conditions, phosphoprotein enrichment using Al(OH)3-based MOAC, and tryptic digest of enriched phosphoproteins followed by TiO2-based MOAC of phosphopeptides and quantitative phosphopeptide measurement by liquid chromatography (LC) and high-resolution accurate mass (HR/AM) mass spectrometry (MS). Thus, tandem MOAC effectively targets the phosphate moiety of phosphoproteins and phosphopeptides and allows probing of the phosphoproteome to unprecedented depth, while (15)N metabolic labeling enables accurate relative quantification of measured peptides and direct comparison between samples.

  20. Data analysis strategies for reducing the influence of the bias in cross-cultural research.

    PubMed

    Sindik, Josko

    2012-03-01

    In cross-cultural research, researchers have to adjust the constructs and associated measurement instruments that have been developed in one culture and then imported for use in another culture. Importing concepts from other cultures is often simply reduced to language adjustment of the content of the items in the measurement instruments that define a certain (psychological) construct. In the context of cross-cultural research, test bias can be defined as a generic term for all nuisance factors that threaten the validity of cross-cultural comparisons. Bias can be an indicator that instrument scores based on the same items measure different traits and characteristics across different cultural groups. To reduce construct, method, and item bias, the researcher can consider these strategies: (1) simply comparing average results on certain measuring instruments; (2) comparing only the reliability of certain dimensions of the measurement instruments, applied to the "target" and "source" samples of participants, i.e. from different cultures; (3) comparing the "framed" factor structure (fixed number of factors) of the measurement instruments, applied to the samples from the "target" and "source" cultures, using an exploratory factor analysis strategy on separate samples; (4) comparing the complete constructs ("unframed" factor analysis, i.e. unlimited number of factors) in relation to their best psychometric properties and interpretability (best suited to certain cultures, applying an exploratory factor analysis strategy); or (5) checking the similarity of the constructs in the samples from different cultures (using a structural equation modeling approach). Each approach has its advantages and disadvantages, which are discussed.

  1. The influence of cognitive load on transfer with error prevention training methods: a meta-analysis.

    PubMed

    Hutchins, Shaun D; Wickens, Christopher D; Carolan, Thomas F; Cumming, John M

    2013-08-01

    The objective was to conduct a research synthesis for the U.S. Army on the effectiveness of two error prevention training strategies (training wheels and scaffolding) on the transfer of training. As part of an ongoing program of research on training effectiveness, the current work examines, from a cognitive load perspective, the effects of error prevention strategies during training on transfer. Based on cognitive load theory, the two training strategies were hypothesized to reduce intrinsic load by supporting learners early in acquisition during schema development. A transfer ratio and Hedges' g were used in the two meta-analyses conducted on transfer studies employing the two training strategies. Moderators relevant to cognitive load theory and specific to the implemented strategies were examined. The transfer ratio was the ratio of treatment transfer performance to control transfer performance. Hedges' g was used to compare treatment and control group standardized mean differences. Both effect sizes were analyzed with versions of sample-weighted fixed-effect models. Analysis of the training wheels strategy suggests a transfer benefit. The observed benefit was strongest when the training wheels were a worked example coupled with a principle-based prompt. Analysis of the scaffolding data also suggests a transfer benefit for that strategy. Both training wheels and scaffolding demonstrated positive transfer as training strategies. As error prevention techniques, both support the intrinsic load-reducing implications of cognitive load theory. The findings are applicable to the development of instructional design guidelines in professional skill-based organizations such as the military.
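
    For reference, the two effect sizes named above can be computed as in the sketch below: a simple transfer ratio and Hedges' g with the usual small-sample correction. The group summary statistics are invented for illustration and are not taken from the meta-analysis.

```python
# Transfer ratio and Hedges' g (standardized mean difference with correction).
import numpy as np

def transfer_ratio(mean_treatment, mean_control):
    return mean_treatment / mean_control

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with Hedges' small-sample correction."""
    s_pooled = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    correction = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)
    return d * correction

print("transfer ratio:", round(transfer_ratio(82.0, 74.0), 3))
print("Hedges' g:", round(hedges_g(m1=82.0, s1=10.0, n1=30, m2=74.0, s2=12.0, n2=28), 3))
```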

  2. Downselection for Sample Return — Defining Sampling Strategies Using Lessons from Terrestrial Field Analogues

    NASA Astrophysics Data System (ADS)

    Stevens, A. H.; Gentry, D.; Amador, E.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Jacobsen, M.; Kirby, J.; McCaig, H.; Murukesan, G.; Rader, E.; Rennie, V.; Schwieterman, E.; Sutton, S.; Tan, G.; Yin, C.; Cullen, D.; Geppert, W.; Stockton, A.

    2018-04-01

    We detail multi-year field investigations in Icelandic Mars analogue environments that have yielded results that can help inform strategies for sample selection and downselection for Mars Sample Return.

  3. An integrative strategy for quantitative analysis of the N-glycoproteome in complex biological samples

    PubMed Central

    2014-01-01

    Background The complexity of protein glycosylation makes it difficult to characterize glycosylation patterns on a proteomic scale. In this study, we developed an integrated strategy for quantitative comparative analysis of N-glycosylation/glycoproteins from complex biological samples in a high-throughput manner. This strategy entailed separating and enriching glycopeptides/glycoproteins using lectin affinity chromatography, then tandem labeling them with 18O/16O to generate a mass shift of 6 Da between the paired glycopeptides, and finally analyzing them with liquid chromatography-mass spectrometry (LC-MS) and the automatic quantitative method we developed based on Mascot Distiller. Results The accuracy and repeatability of this strategy were first verified using standard glycoproteins; linearity was maintained within a range of 1:10–10:1. The peptide concentration ratios obtained by the self-built quantitative method were similar to both the manually calculated and theoretical values, with a standard deviation (SD) of 0.023–0.186 for glycopeptides. The feasibility of the strategy was further confirmed with serum from hepatocellular carcinoma (HCC) patients and healthy individuals; the expression of 44 glycopeptides and 30 glycoproteins was significantly different between HCC patient and control serum. Conclusions This strategy is accurate, repeatable, and efficient, and may be a useful tool for identification of disease-related N-glycosylation/glycoprotein changes. PMID:24428921

  4. Plant iTRAQ-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Handakumbura, Pubudu; Hixson, Kim K.; Purvine, Samuel O.

    We present a simple one-pot extraction protocol, which rapidly isolates hydrophilic metabolites, lipids, and proteins from the same pulverized plant sample. Also detailed is a global plant proteomics sample preparation method utilizing iTRAQ multiplexing reagents that enables deep proteome coverage due to the use of HPLC fractionation of the peptides prior to mass spectrometric analysis. We have successfully used this protocol on several different plant tissues (e.g., roots, stems, leaves) from different plants (e.g., sorghum, poplar, Arabidopsis, soybean), and have been able to detect and quantify thousands of proteins. Multiplexing strategies such as iTRAQ and the bioinformatics strategy outlined here ultimately provide insight into which proteins are significantly changed in abundance between two or more groups (e.g., control, perturbation). Our bioinformatics strategy yields z-score values, which normalize the expression data into a format that can easily be cross-compared with other expression data (i.e., metabolomics, transcriptomics) obtained from different analytical methods and instrumentation.
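
    The z-score normalization mentioned above can be sketched as centering and scaling each protein's (log) abundances across samples, as below. The abundance matrix is synthetic; the actual iTRAQ reporter-ion processing and significance testing are not reproduced.

```python
# Per-protein z-scores across iTRAQ channels/samples on a synthetic matrix.
import numpy as np

rng = np.random.default_rng(5)
abundance = rng.lognormal(mean=10.0, sigma=1.0, size=(1000, 8))  # proteins x channels

log_ab = np.log2(abundance)
z = (log_ab - log_ab.mean(axis=1, keepdims=True)) / log_ab.std(axis=1, keepdims=True)
print("z-scores for the first protein:", np.round(z[0], 2))
```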

  5. Video-feedback intervention increases sensitive parenting in ethnic minority mothers: a randomized control trial.

    PubMed

    Yagmur, Sengul; Mesman, Judi; Malda, Maike; Bakermans-Kranenburg, Marian J; Ekmekci, Hatice

    2014-01-01

    Using a randomized controlled trial design, we tested the effectiveness of a culturally sensitive adaptation of the Video-feedback Intervention to promote Positive Parenting and Sensitive Discipline (VIPP-SD) in a sample of 76 Turkish minority families in the Netherlands. The VIPP-SD was adapted based on a pilot with feedback from the target mothers, resulting in the VIPP-TM (VIPP-Turkish Minorities). The sample included families with 20-47-month-old children with high levels of externalizing problems. Maternal sensitivity, nonintrusiveness, and discipline strategies were observed during pretest and posttest home visits. The VIPP-TM was effective in increasing maternal sensitivity and nonintrusiveness, but not in enhancing discipline strategies. Applying newly learned sensitivity skills in discipline situations may take more time, especially in a cultural context that favors more authoritarian strategies. We conclude that the VIPP-SD program and its video-feedback approach can be successfully applied in immigrant families with a non-Western cultural background, with demonstrated effects on parenting sensitivity and nonintrusiveness.

  6. A PCR-based tool for the cultivation-independent monitoring of Pandora neoaphidis.

    PubMed

    Fournier, A; Enkerli, J; Keller, S; Widmer, F

    2008-09-01

    Pandora neoaphidis is one of the most important fungal pathogens of aphids and has great potential for use in biocontrol. Little is known about how this fungus persists in an area, and in particular about its overwintering strategies. It is hypothesized that natural areas play an important role for survival and that soil may serve as a source of inoculum for new aphid populations in spring. To test these hypotheses, a cultivation-independent PCR-based diagnostic tool was developed that allows the detection of P. neoaphidis in the environment. Two P. neoaphidis specific PCR primer pairs were designed, targeting sequences in the ribosomal RNA gene cluster. Specificity of both primer pairs was demonstrated with P. neoaphidis and closely related non-target entomophthoralean species. Moreover, single amplicons of the expected sizes were obtained with both primer pairs from various environmental sample types, including aphid cadavers, plant material, and soil. The PCR-based diagnostic tool was applied to investigate the persistence of P. neoaphidis in soil samples obtained in 2004/2005 from a nettle field harboring infected aphids in fall 2004. P. neoaphidis was detected in every sample collected in November 2004 and March 2005, suggesting an overwintering stage of P. neoaphidis in top soil layers. The developed cultivation-independent PCR-based tool will be valuable for further investigation of the ecology of P. neoaphidis and for the development and future implementation of management strategies against aphids involving conservation biocontrol.

  7. Effects of 16S rDNA sampling on estimates of the number of endosymbiont lineages in sucking lice

    PubMed Central

    Burleigh, J. Gordon; Light, Jessica E.; Reed, David L.

    2016-01-01

    Phylogenetic trees can reveal the origins of endosymbiotic lineages of bacteria and detect patterns of co-evolution with their hosts. Although taxon sampling can greatly affect phylogenetic and co-evolutionary inference, most hypotheses of endosymbiont relationships are based on few available bacterial sequences. Here we examined how different sampling strategies of Gammaproteobacteria sequences affect estimates of the number of endosymbiont lineages in parasitic sucking lice (Insecta: Phthiraptera: Anoplura). We estimated the number of louse endosymbiont lineages using both newly obtained and previously sequenced 16S rDNA bacterial sequences and more than 42,000 16S rDNA sequences from other Gammaproteobacteria. We also performed parametric and nonparametric bootstrapping experiments to examine the effects of phylogenetic error and uncertainty on these estimates. Sampling of 16S rDNA sequences affects the estimates of endosymbiont diversity in sucking lice until we reach a threshold of genetic diversity, the size of which depends on the sampling strategy. Sampling by maximizing the diversity of 16S rDNA sequences is more efficient than randomly sampling available 16S rDNA sequences. Although simulation results validate estimates of multiple endosymbiont lineages in sucking lice, the bootstrap results suggest that the precise number of endosymbiont origins is still uncertain. PMID:27547523
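
    The "sampling by maximizing diversity" idea can be sketched as a greedy farthest-point selection over a pairwise distance matrix, as below. A random distance matrix stands in for pairwise 16S rDNA sequence distances, so the sketch shows only the selection logic, not the phylogenetic analysis.

```python
# Greedy farthest-point ("maxmin") selection of a diverse subset of sequences.
import numpy as np

rng = np.random.default_rng(11)
pts = rng.random((500, 4))                                         # placeholder data
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)  # pairwise distances

def maxmin_sample(dist, k, start=0):
    """Pick k items, each maximizing its minimum distance to those already picked."""
    selected = [start]
    min_dist = dist[start].copy()
    while len(selected) < k:
        nxt = int(np.argmax(min_dist))        # farthest from the current selection
        selected.append(nxt)
        min_dist = np.minimum(min_dist, dist[nxt])
    return selected

print("diversity-maximizing subset:", maxmin_sample(dist, k=10))
```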

  8. Determining best practices in reconnoitering sites for habitability potential on Mars using a semi-autonomous rover: A GeoHeuristic Operational Strategies Test.

    PubMed

    Yingst, R A; Berger, J; Cohen, B A; Hynek, B; Schmidt, M E

    2017-03-01

    We tested science operations strategies developed for use in remote mobile spacecraft missions to determine whether reconnoitering a site of potential habitability prior to in-depth study (a walkabout-first strategy) can be a more efficient use of time and resources than the linear approach commonly used by planetary rover missions. Two field teams studied a sedimentary sequence in Utah to assess habitability potential. At each site, one team commanded a human "rover" to execute observations, conducted data analysis, and made follow-on decisions based solely on those observations. Another team followed the same traverse using traditional terrestrial field methods, and the results of the two teams were compared. Test results indicate that for a mission with goals similar to our field case, the walkabout-first strategy may save time and other mission resources while improving science return. The approach enabled more informed choices and higher team confidence in choosing where to spend time and other consumable resources. The walkabout strategy may prove most efficient when many closely spaced sites must be triaged to a smaller subset for detailed study or sampling. This situation would arise when mission goals include finding, identifying, characterizing or sampling a specific material, feature or type of environment within a certain area.

  9. California desert resource inventory using multispectral classification of digitally mosaicked Landsat frames

    NASA Technical Reports Server (NTRS)

    Bryant, N. A.; Mcleod, R. G.; Zobrist, A. L.; Johnson, H. B.

    1979-01-01

    Procedures for adjustment of brightness values between frames and the digital mosaicking of Landsat frames to standard map projections are developed for providing a continuous data base for multispectral thematic classification. A combination of local terrain variations in the Californian deserts and a global sampling strategy based on transects provided the framework for accurate classification throughout the entire geographic region.

  10. Steer-PROP: a GRASE-PROPELLER sequence with interecho steering gradient pulses.

    PubMed

    Srinivasan, Girish; Rangwala, Novena; Zhou, Xiaohong Joe

    2018-05-01

    This study demonstrates a novel PROPELLER (periodically rotated overlapping parallel lines with enhanced reconstruction) pulse sequence, termed Steer-PROP, based on gradient and spin echo (GRASE), to reduce the imaging times and address phase errors inherent to GRASE. The study also illustrates the feasibility of using Steer-PROP as an alternative to single-shot echo planar imaging (SS-EPI) to produce distortion-free diffusion images in all imaging planes. Steer-PROP uses a series of blip gradient pulses to produce N (N = 3-5) adjacent k-space blades in each repetition time, where N is the number of gradient echoes in a GRASE sequence. This sampling strategy enables a phase correction algorithm to systematically address the GRASE phase errors as well as the motion-induced phase inconsistency. Steer-PROP was evaluated on phantoms and healthy human subjects at both 1.5T and 3.0T for T2- and diffusion-weighted imaging. Steer-PROP produced similar image quality as conventional PROPELLER based on fast spin echo (FSE), while taking only a fraction (e.g., 1/3) of the scan time. The robustness against motion in Steer-PROP was comparable to that of FSE-based PROPELLER. Using Steer-PROP, high quality and distortion-free diffusion images were obtained from human subjects in all imaging planes, demonstrating a considerable advantage over SS-EPI. The proposed Steer-PROP sequence can substantially reduce the scan times compared with FSE-based PROPELLER while achieving adequate image quality. The novel k-space sampling strategy in Steer-PROP not only enables an integrated phase correction method that addresses various sources of phase errors, but also minimizes the echo spacing compared with alternative sampling strategies. Steer-PROP can also be a viable alternative to SS-EPI to decrease image distortion in all imaging planes. Magn Reson Med 79:2533-2541, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  11. Efficient sampling of complex network with modified random walk strategies

    NASA Astrophysics Data System (ADS)

    Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei

    2018-02-01

    We present two novel random walk strategies, the choosing seed node (CSN) random walk and the no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings (classical, CSN, and NR) are applied to the Erdös-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks. The major properties of the sampled subnets, such as sampling efficiency, degree distribution, average degree, and average clustering coefficient, are then studied. Similar conclusions can be reached with all three random walk strategies. First, networks with small scale and simple structure are conducive to sampling. Second, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within a limited number of steps. Third, all the degree distributions of the subnets are slightly biased toward the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, salient characteristics such as the larger clustering coefficient and the fluctuation of the degree distribution are reproduced well by these random walk strategies.
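
    The no-retracing rule can be sketched in a few lines: at each step the walker chooses a random neighbor but avoids stepping straight back to the node it just left, unless that is the only option. The toy graph below is hypothetical, and the CSN seed-selection strategy is not shown.

```python
# Minimal sketch of the no-retracing (NR) random walk on a toy adjacency list.
import random

graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3]}

def nr_random_walk(graph, start, steps, seed=0):
    rng = random.Random(seed)
    walk, prev = [start], None
    for _ in range(steps):
        current = walk[-1]
        # Avoid stepping straight back to the previous node when possible.
        choices = [n for n in graph[current] if n != prev] or graph[current]
        nxt = rng.choice(choices)
        walk.append(nxt)
        prev = current
    return walk

walk = nr_random_walk(graph, start=0, steps=20)
print("visited nodes (sampled subnet):", sorted(set(walk)))
```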

  12. An evaluation of sampling and full enumeration strategies for Fisher Jenks classification in big data settings

    USGS Publications Warehouse

    Rey, Sergio J.; Stephens, Philip A.; Laura, Jason R.

    2017-01-01

    Large data contexts present a number of challenges to optimal choropleth map classifiers. Application of optimal classifiers to a sample of the attribute space is one proposed solution. The properties of alternative sampling-based classification methods are examined through a series of Monte Carlo simulations. Spatial autocorrelation, the number of desired classes, and the form of sampling are shown to have significant impacts on the accuracy of map classifications. Tradeoffs between the improved speed of the sampling approaches and the loss of accuracy are also considered. The results suggest the possibility of guiding the choice of classification scheme as a function of the properties of large data sets.
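
    The sampling approach can be sketched as computing class breaks on a random sample of the attribute and then applying them to the full data set. In the sketch below, a one-dimensional k-means is used as a plain stand-in for the Fisher-Jenks optimizer (the two produce similar breaks), and the attribute values are synthetic.

```python
# Sampling-based choropleth classification: breaks from a sample, applied to all data.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(9)
attribute = rng.lognormal(mean=2.0, sigma=0.6, size=1_000_000)   # "big" attribute vector

n_classes, n_sample = 5, 5_000
sample = rng.choice(attribute, size=n_sample, replace=False)

km = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit(sample.reshape(-1, 1))
centers = np.sort(km.cluster_centers_.ravel())
breaks = (centers[:-1] + centers[1:]) / 2          # cut points between adjacent classes

classes = np.digitize(attribute, breaks)            # classify the full data set
print("breaks:", np.round(breaks, 2), "| class counts:", np.bincount(classes))
```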

  13. Colorimetric and Fluorescent Dual Mode Sensing of Alcoholic Strength in Spirit Samples with Stimuli-Responsive Infinite Coordination Polymers.

    PubMed

    Deng, Jingjing; Ma, Wenjie; Yu, Ping; Mao, Lanqun

    2015-07-07

    This study demonstrates a new strategy for colorimetric and fluorescent dual mode sensing of alcoholic strength (AS) in spirit samples based on stimuli-responsive infinite coordination polymers (ICPs). The ICP supramolecular network is prepared with 1,4-bis(imidazol-1-ylmethyl)benzene (bix) as the ligand and Zn(2+) as the central metal ion in ethanol, in which rhodamine B (RhB) is encapsulated through self-adaptive chemistry. In pure ethanol solvent, the as-formed RhB/Zn(bix) is well dispersed and quite stable. However, the addition of water into the ethanol dispersion of RhB/Zn(bix) destroys the Zn(bix) network structure, resulting in the release of RhB from the ICP into the solvent. As a consequence, the solvent displays the color of the released RhB and, at the same time, the fluorescence of RhB is turned on, which constitutes a new mechanism for colorimetric and fluorescent dual mode sensing of AS in commercial spirit samples. With the method developed here, we could distinguish the AS of different commercial spirit samples with the naked eye within a wide linear range from 20 to 100% vol and by monitoring the increase in fluorescence intensity of the released RhB. This study not only offers a new method for on-spot visible detection of AS in commercial spirit samples, but also provides a strategy for designing dual mode sensing mechanisms for different analytical purposes based on novel stimuli-responsive materials.

  14. Applying active learning to supervised word sense disambiguation in MEDLINE.

    PubMed

    Chen, Yukun; Cao, Hongxin; Mei, Qiaozhu; Zheng, Kai; Xu, Hua

    2013-01-01

    This study aimed to assess whether active learning strategies can be integrated with supervised word sense disambiguation (WSD) methods, thus reducing the number of annotated samples while keeping or improving the quality of disambiguation models. We developed support vector machine (SVM) classifiers to disambiguate 197 ambiguous terms and abbreviations in the MSH WSD collection. Three different uncertainty sampling-based active learning algorithms were implemented with the SVM classifiers and were compared with a passive learner (PL) based on random sampling. For each ambiguous term and each learning algorithm, a learning curve that plots the accuracy computed from the test set as a function of the number of annotated samples used in the model was generated. The area under the learning curve (ALC) was used as the primary metric for evaluation. Our experiments demonstrated that active learners (ALs) significantly outperformed the PL, showing better performance for 177 out of 197 (89.8%) WSD tasks. Further analysis showed that to achieve an average accuracy of 90%, the PL needed 38 annotated samples, while the ALs needed only 24, a 37% reduction in annotation effort. Moreover, we analyzed cases where active learning algorithms did not achieve superior performance and identified three causes: (1) poor models in the early learning stage; (2) easy WSD cases; and (3) difficult WSD cases, which provide useful insight for future improvements. This study demonstrated that integrating active learning strategies with supervised WSD methods could effectively reduce annotation cost and improve the disambiguation models.
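
    The sketch below shows one of the simplest uncertainty sampling loops (least-confidence querying with a probabilistic SVM). It assumes generic feature vectors X and labels y rather than the MSH WSD collection, and illustrates the general strategy rather than the authors' implementation.

```python
# Least-confidence uncertainty sampling with scikit-learn's SVC.
import numpy as np
from sklearn.svm import SVC

def uncertainty_sampling(X, y, n_init=5, n_queries=20, seed=0):
    rng = np.random.default_rng(seed)
    labeled = list(rng.choice(len(X), size=n_init, replace=False))
    # assumption: the initial random draw contains at least two classes
    pool = [i for i in range(len(X)) if i not in labeled]
    model = SVC(probability=True)
    for _ in range(n_queries):
        model.fit(X[labeled], y[labeled])
        proba = model.predict_proba(X[pool])
        # query the pool sample whose most probable class is least certain
        query = pool[int(np.argmin(proba.max(axis=1)))]
        labeled.append(query)
        pool.remove(query)
    return model, labeled
```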

  15. A human fecal contamination score for ranking recreational sites using the HF183/BacR287 quantitative real-time PCR method.

    PubMed

    Cao, Yiping; Sivaganesan, Mano; Kelty, Catherine A; Wang, Dan; Boehm, Alexandria B; Griffith, John F; Weisberg, Stephen B; Shanks, Orin C

    2018-01-01

    Human fecal pollution of recreational waters remains a public health concern worldwide. As a result, there is a growing interest in the application of human-associated fecal source identification quantitative real-time PCR (qPCR) technologies for water quality research and management. However, there are currently no standardized approaches for field implementation and interpretation of qPCR data. In this study, a standardized HF183/BacR287 qPCR method was combined with a water sampling strategy and a novel Bayesian weighted average approach to establish a human fecal contamination score (HFS) that can be used to prioritize sampling sites for remediation based on measured human waste levels. The HFS was then used to investigate 975 study design scenarios utilizing different combinations of sites with varying sampling intensities (daily to once per week) and number of qPCR replicates per sample (2-14 replicates). Findings demonstrate that site prioritization with HFS is feasible and that both sampling intensity and number of qPCR replicates influence reliability of HFS estimates. The novel data analysis strategy presented here provides a prescribed approach for the implementation and interpretation of human-associated HF183/BacR287 qPCR data with the goal of site prioritization based on human fecal pollution levels. In addition, information is provided for future users to customize study designs for optimal HFS performance. Published by Elsevier Ltd.

  16. Applying active learning to supervised word sense disambiguation in MEDLINE

    PubMed Central

    Chen, Yukun; Cao, Hongxin; Mei, Qiaozhu; Zheng, Kai; Xu, Hua

    2013-01-01

    Objectives This study aimed to assess whether active learning strategies can be integrated with supervised word sense disambiguation (WSD) methods, thus reducing the number of annotated samples while keeping or improving the quality of disambiguation models. Methods We developed support vector machine (SVM) classifiers to disambiguate 197 ambiguous terms and abbreviations in the MSH WSD collection. Three different uncertainty sampling-based active learning algorithms were implemented with the SVM classifiers and were compared with a passive learner (PL) based on random sampling. For each ambiguous term and each learning algorithm, a learning curve that plots the accuracy computed from the test set as a function of the number of annotated samples used in the model was generated. The area under the learning curve (ALC) was used as the primary metric for evaluation. Results Our experiments demonstrated that active learners (ALs) significantly outperformed the PL, showing better performance for 177 out of 197 (89.8%) WSD tasks. Further analysis showed that to achieve an average accuracy of 90%, the PL needed 38 annotated samples, while the ALs needed only 24, a 37% reduction in annotation effort. Moreover, we analyzed cases where active learning algorithms did not achieve superior performance and identified three causes: (1) poor models in the early learning stage; (2) easy WSD cases; and (3) difficult WSD cases, which provide useful insight for future improvements. Conclusions This study demonstrated that integrating active learning strategies with supervised WSD methods could effectively reduce annotation cost and improve the disambiguation models. PMID:23364851

  17. Sampling strategies for estimating acute and chronic exposures of pesticides in streams

    USGS Publications Warehouse

    Crawford, Charles G.

    2004-01-01

    The Food Quality Protection Act of 1996 requires that human exposure to pesticides through drinking water be considered when establishing pesticide tolerances in food. Several systematic and seasonally weighted systematic sampling strategies for estimating pesticide concentrations in surface water were evaluated through Monte Carlo simulation, using intensive datasets from four sites in northwestern Ohio. The number of samples for the strategies ranged from 4 to 120 per year. Sampling strategies with a minimal sampling frequency outside the growing season can be used for estimating time-weighted mean and percentile concentrations of pesticides with little loss of accuracy and precision, compared to strategies with the same sampling frequency year-round. Less frequent sampling strategies can be used at large sites. A sampling frequency of 10 times monthly during the pesticide runoff period at a 90 km2 basin and four times monthly at a 16,400 km2 basin provided estimates of the time-weighted mean, 90th, 95th, and 99th percentile concentrations that fell within 50 percent of the true value virtually all of the time. By taking into account basin size and the periodic nature of pesticide runoff, the costs of obtaining estimates of time-weighted mean and percentile pesticide concentrations can be minimized.
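
    A hedged sketch of the kind of Monte Carlo evaluation described: given a (here synthetic) daily concentration series, systematic samples with random start dates are drawn repeatedly and the resulting errors in the time-weighted mean and 95th percentile are summarized. The seasonal pulse, the interval, and the error metrics are illustrative, not the study's values.

```python
# Monte Carlo evaluation of a systematic sampling strategy on a daily series.
import numpy as np

def evaluate_strategy(daily_conc, interval_days, n_sims=1000, seed=0):
    """Mean relative error of the time-weighted mean and 95th percentile."""
    rng = np.random.default_rng(seed)
    true_mean, true_p95 = daily_conc.mean(), np.percentile(daily_conc, 95)
    err_mean, err_p95 = [], []
    for _ in range(n_sims):
        start = rng.integers(interval_days)           # random start day
        sample = daily_conc[start::interval_days]     # systematic sample
        err_mean.append(abs(sample.mean() - true_mean) / true_mean)
        err_p95.append(abs(np.percentile(sample, 95) - true_p95) / true_p95)
    return np.mean(err_mean), np.mean(err_p95)

days = np.arange(365)
conc = 0.05 + 2.0 * np.exp(-((days - 150) / 20.0) ** 2)   # synthetic runoff pulse
print(evaluate_strategy(conc, interval_days=14))          # roughly biweekly sampling
```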

  18. The biogeochemical structuring role of horizontal stirring: Lagrangian perspectives on iron delivery downstream of the Kerguelen Plateau

    NASA Astrophysics Data System (ADS)

    d'Ovidio, F.; Della Penna, A.; Trull, T. W.; Nencioli, F.; Pujol, M.-I.; Rio, M.-H.; Park, Y.-H.; Cotté, C.; Zhou, M.; Blain, S.

    2015-10-01

    Field campaigns are instrumental in providing ground truth for understanding and modeling global ocean biogeochemical budgets. A survey however can only inspect a fraction of the global oceans, typically a region hundreds of kilometers wide for a temporal window of the order of (at most) several weeks. This spatiotemporal domain is also the one in which the mesoscale activity induces through horizontal stirring a strong variability in the biogeochemical tracers, with ephemeral, local contrasts which can easily mask the regional and seasonal gradients. Therefore, whenever local in situ measures are used to infer larger-scale budgets, one faces the challenge of identifying the mesoscale structuring effect, if not simply to filter it out. In the case of the KEOPS2 investigation of biogeochemical responses to natural iron fertilization, this problem was tackled by designing an adaptive sampling strategy based on regionally optimized multisatellite products analyzed in real time by specifically designed Lagrangian diagnostics. This strategy identified the different mesoscale and stirring structures present in the region and tracked the dynamical frontiers among them. It also enabled back trajectories for the ship-sampled stations to be estimated, providing important insights into the timing and pathways of iron supply, which were explored further using a model based on first-order iron removal. This context was essential for the interpretation of the field results. The mesoscale circulation-based strategy was also validated post-cruise by comparing the Lagrangian maps derived from satellites with the patterns of more than one hundred drifters, including some adaptively released during KEOPS2 and a subsequent research voyage. The KEOPS2 strategy was adapted to the specific biogeochemical characteristics of the region, but its principles are general and will be useful for future in situ biogeochemical surveys.
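
    The first-order removal model mentioned above reduces to exponential decay of dissolved iron with water-parcel age; a minimal sketch follows, with an illustrative rate constant and ages (the study's actual parameters are not reproduced).

```python
# First-order iron removal along back-trajectory ages: C(t) = C0 * exp(-k t).
import numpy as np

def iron_remaining(age_days, k_per_day=0.05, c0=1.0):
    """Fraction of the initial iron stock remaining after age_days of transit."""
    return c0 * np.exp(-k_per_day * np.asarray(age_days, dtype=float))

print(iron_remaining([5, 15, 40]))   # illustrative water-parcel ages in days
```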

  19. The biogeochemical structuring role of horizontal stirring: Lagrangian perspectives on iron delivery downstream of the Kerguelen plateau

    NASA Astrophysics Data System (ADS)

    d'Ovidio, F.; Della Penna, A.; Trull, T. W.; Nencioli, F.; Pujol, I.; Rio, M. H.; Park, Y.-H.; Cotté, C.; Zhou, M.; Blain, S.

    2015-01-01

    Field campaigns are instrumental in providing ground truth for understanding and modelling global ocean biogeochemical budgets. A survey however can only inspect a fraction of the global oceans, typically a region hundreds of kilometres wide for a temporal window of the order of (at most) several weeks. This spatiotemporal domain is also the one in which the mesoscale activity induces through horizontal stirring a strong variability in the biogeochemical tracers, with ephemeral, local contrasts which can easily mask the regional and seasonal gradients. Therefore, whenever local in-situ measures are used to infer larger-scale budgets, one faces the challenge of identifying the mesoscale structuring effect, if not simply to filter it out. In the case of the KEOPS2 investigation of biogeochemical responses to natural iron fertilization, this problem was tackled by designing an adaptive sampling strategy based on regionally-optimized multisatellite products analyzed in real time by specifically designed Lagrangian diagnostics. This strategy identified the different mesoscale and stirring structures present in the region and tracked the dynamical frontiers among them. It also enabled back-trajectories for the ship-sampled stations to be estimated, providing important insights into the timing and pathways of iron supply, which were explored further using a model based on first-order iron removal. This context was essential for the interpretation of the field results. The mesoscale circulation-based strategy was also validated post-cruise by comparing the Lagrangian maps derived from satellites with the patterns of more than one hundred drifters adaptively released during KEOPS2 and a subsequent research voyage. The KEOPS2 strategy was adapted to the specific biogeochemical characteristics of the region, but its principles are general and will be useful for future in-situ biogeochemical surveys.

  20. A Novel Implementation Strategy in Residential Care Settings to Promote EBP: Direct Care Provider Perceptions and Development of a Conceptual Framework.

    PubMed

    Slaughter, Susan E; Bampton, Erin; Erin, Daniel F; Ickert, Carla; Jones, C Allyson; Estabrooks, Carole A

    2017-06-01

    Innovative approaches are required to facilitate the adoption and sustainability of evidence-based care practices. We propose a novel implementation strategy, a peer reminder role, which involves offering a brief formal reminder to peers during structured unit meetings. This study aims to (a) identify healthcare aide (HCA) perceptions of a peer reminder role for HCAs, and (b) develop a conceptual framework for the role based on these perceptions. In 2013, a qualitative focus group study was conducted in five purposively sampled residential care facilities in western Canada. A convenience sample of 24 HCAs agreed to participate in five focus groups. Concurrent with data collection, two researchers coded the transcripts and identified themes by consensus. They jointly determined when saturation was achieved and took steps to optimize the trustworthiness of the findings. Five HCAs from the original focus groups commented on the resulting conceptual framework. HCAs were cautious about accepting a role that might alienate them from their co-workers. They emphasized feeling comfortable with the peer reminder role and identified circumstances that would optimize their comfort including: effective implementation strategies, perceptions of the role, role credibility and a supportive context. These intersecting themes formed a peer reminder conceptual framework. We identified HCAs' perspectives of a new peer reminder role designed specifically for them. Based on their perceptions, a conceptual framework was developed to guide the implementation of a peer reminder role for HCAs. This role may be a strategic implementation strategy to optimize the sustainability of new practices in residential care settings, and the related framework could offer guidance on how to implement this role. © 2017 Sigma Theta Tau International.

  1. Designing HIV Testing Algorithms Based on 2015 WHO Guidelines Using Data from Six Sites in Sub-Saharan Africa

    PubMed Central

    Kosack, Cara S.; Shanks, Leslie; Beelaert, Greet; Benson, Tumwesigye; Savane, Aboubacar; Ng'ang'a, Anne; Bita, André; Zahinda, Jean-Paul B. N.; Fransen, Katrien

    2017-01-01

    Our objective was to evaluate the performance of HIV testing algorithms based on WHO recommendations, using data from specimens collected at six HIV testing and counseling sites in sub-Saharan Africa (Conakry, Guinea; Kitgum and Arua, Uganda; Homa Bay, Kenya; Douala, Cameroon; Baraka, Democratic Republic of Congo). A total of 2,780 samples, including 1,306 HIV-positive samples, were included in the analysis. HIV testing algorithms were designed using Determine as a first test. Second and third rapid diagnostic tests (RDTs) were selected based on site-specific performance, adhering where possible to the WHO-recommended minimum requirements of ≥99% sensitivity and specificity. The threshold for specificity was reduced to 98% or 96% if necessary. We also simulated algorithms consisting of one RDT followed by a simple confirmatory assay. The positive predictive values (PPV) of the simulated algorithms ranged from 75.8% to 100% using strategies recommended for high-prevalence settings, 98.7% to 100% using strategies recommended for low-prevalence settings, and 98.1% to 100% using a rapid test followed by a simple confirmatory assay. Although we were able to design algorithms that met the recommended PPV of ≥99% in five of six sites using the applicable high-prevalence strategy, options were often very limited due to suboptimal performance of individual RDTs and to shared falsely reactive results. These results underscore the impact of the sequence of HIV tests and of shared false-reactivity data on algorithm performance. Where it is not possible to identify tests that meet WHO-recommended specifications, the low-prevalence strategy may be more suitable. PMID:28747371
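
    For intuition, the positive predictive value of a serial two-RDT algorithm can be computed under the (often violated) assumption of conditionally independent tests; the sketch below uses illustrative sensitivities, specificities, and prevalences. The shared false reactivity noted in the study is exactly what breaks this idealization and pushes observed PPVs below the textbook figure.

```python
# PPV of a serial two-test algorithm, assuming conditional independence.
def serial_ppv(prevalence, se1, sp1, se2, sp2):
    """PPV when a specimen is called positive only if both tests are reactive."""
    true_pos = prevalence * se1 * se2
    false_pos = (1 - prevalence) * (1 - sp1) * (1 - sp2)
    return true_pos / (true_pos + false_pos)

print(serial_ppv(0.10, 0.998, 0.99, 0.99, 0.99))  # high-prevalence setting
print(serial_ppv(0.01, 0.998, 0.99, 0.99, 0.99))  # lower prevalence, lower PPV
```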

  2. Gender-Differentiated Parenting Revisited: Meta-Analysis Reveals Very Few Differences in Parental Control of Boys and Girls.

    PubMed

    Endendijk, Joyce J; Groeneveld, Marleen G; Bakermans-Kranenburg, Marian J; Mesman, Judi

    2016-01-01

    Although various theories describe mechanisms leading to differential parenting of boys and girls, there is no consensus about the extent to which parents do treat their sons and daughters differently. The last meta-analyses on the subject were conducted more than fifteen years ago, and changes in gender-specific child rearing in the past decade are quite plausible. In the current set of meta-analyses, based on 126 observational studies (15,034 families), we examined mothers' and fathers' differential use of autonomy-supportive and controlling strategies with boys and girls, and the role of moderators related to the decade in which the study was conducted, the observational context, and sample characteristics. Databases of Web of Science, ERIC, PsychInfo, Online Contents, Picarta, and Proquest were searched for studies examining differences in observed parental control of boys and girls between the ages of 0 and 18 years. Few differences were found in parents' use of control with boys and girls. Parents were slightly more controlling with boys than with girls, but the effect size was negligible (d = 0.08). The effect was larger, but still small, in normative groups and in samples with younger children. No overall effect for gender-differentiated autonomy-supportive strategies was found (d = 0.03). A significant effect of time emerged: studies published in the 1970s and 1980s reported more autonomy-supportive strategies with boys than with girls, but from 1990 onwards parents showed somewhat more autonomy-supportive strategies with girls than with boys. Taking into account parents' gender stereotypes might uncover subgroups of families where gender-differentiated control is salient, but based on our systematic review of the currently available large database we conclude that in general the differences between parenting of boys versus girls are minimal.

  3. Practice Patterns Compared with Evidence-based Strategies for the Management of Androgen Deprivation Therapy-Induced Side Effects in Prostate Cancer Patients: Results of a European Web-based Survey.

    PubMed

    Bultijnck, Renée; Surcel, Cristian; Ploussard, Guillaume; Briganti, Alberto; De Visschere, Pieter; Fütterer, Jurgen; Ghadjar, Pirus; Giannarini, Gianluca; Isbarn, Hendrik; Massard, Christophe; Sooriakumaran, Prasanna; Valerio, Massimo; van den Bergh, Roderick; Ost, Piet

    2016-12-01

    Evidence-based recommendations are available for the management of androgen deprivation therapy (ADT)-induced side effects; however, there are no data on the implementation of the recommendations into daily practice patterns. To compare practice patterns in the management of ADT-induced side effects with evidence-based strategies. A European Web-based survey was conducted from January 16, 2015, to June 24, 2015. The 25-item questionnaire was designed with the aid of expert opinion and covered general respondent information, ADT preference per disease stage, patient communication on ADT-induced side effects, and strategies to mitigate side effects. All questions referred to patients with long-term ADT use. Reported practice patterns were compared with available evidence-based strategies. Following data collection, descriptive statistics were used for analysis. Frequency distributions were compiled and compared using a generalised chi-square test. In total, 489 eligible respondents completed the survey. Luteinising hormone-releasing hormone-agonist with or without an antiandrogen was the preferred method of ADT in different settings. Patients were well informed about loss of libido (90%), hot flushes (85%), fatigue (67%), and osteoporosis (63%). An osteoporotic and metabolic risk assessment prior to commencing ADT was done by one-quarter of physicians. The majority (85%) took preventive measures and applied at least one evidence-based strategy. Exercise was recommended by three-quarters of physicians who advocate its positive effects; however, only 25% of physicians had access to exercise programmes. Although the minimum sample size was set at 400 participants, the current survey remains susceptible to volunteer and nonresponder bias. Patients were well informed about several ADT-induced complications but uncommonly underwent an osteoporotic and metabolic risk assessment. Nevertheless, physicians partially provided evidence-based strategies for the management of the complications. Physicians often advised exercise to reduce ADT-induced side effects, but programmes were not widely available. Implementation of evidence-based strategies for androgen deprivation therapy-induced side effects in real-life practice patterns should be improved. Copyright © 2016 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  4. Stakeholders' Views of South Korea's Higher Education Internationalization Policy

    ERIC Educational Resources Information Center

    Cho, Young Ha; Palmer, John D.

    2013-01-01

    The study investigated the stakeholders' perceptions of South Korea's higher education internationalization policy. Based on the research framework that defines four policy values--propriety, effectiveness, diversity, and engagement, the convergence model was employed with a concurrent mixed method sampling strategy to analyze the stakeholders'…

  5. Investigation of Low Cost Sensor-Based Leak Detection System for Fence Line Applications

    EPA Science Inventory

    With recent technological advances, low-cost time-resolved sensors may become effective tools to support time-integrated passive sampling strategies by helping to decipher origin of emissions in real-time. As part of the Petroleum Refinery Risk and Technology Review, New Source ...

  6. The 1963 Hip-Hop Machine: Hip-Hop Pedagogy as Composition.

    ERIC Educational Resources Information Center

    Rice, Jeff

    2003-01-01

    Proposes an alternative invention strategy for research-based argumentative writing. Investigates the coincidental usage of the term "whatever" in hip-hop, theory, and composition studies. Presents a "whatever-pedagogy" identified as "hip-hop pedagogy," a writing practice that models itself after digital sampling's…

  7. 78 FR 54253 - Proposed Data Collections Submitted for Public Comment and Recommendations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-03

    ... every level of the social ecology, including individuals, families, and communities, and thus have the... effective population-based prevention strategies. Approximately 188 Colorado state and county employees and partners form the sample population. Specifically, state- and county-level employees working in welfare and...

  8. Power and Communication Behavior: A Formulative Investigation.

    ERIC Educational Resources Information Center

    Cavanaugh, Mary; And Others

    As part of a research program designed to explicate the construct "power," a formulative research strategy was used to identify 37 personal orientations toward power. A preliminary instrument based on these orientations was administered to samples of corporate executives, government employees, law enforcement personnel, and sales associates. A…

  9. Sampling Strategies for Evaluating the Rate of Adventitious Transgene Presence in Non-Genetically Modified Crop Fields.

    PubMed

    Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine

    2017-09-01

    According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
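
    A minimal sketch of the ratio-reweighting idea follows: the transgene presence rate measured at randomly sampled locations is scaled by the ratio of the field-wide mean of the auxiliary variable (the gene-flow model output) to its mean at the sampled locations. All numbers are illustrative, not data from the study.

```python
# Classical ratio estimator using a gene-flow model output as auxiliary variable.
import numpy as np

def ratio_estimate(observed_rates, aux_at_sites, aux_field_mean):
    """Scale the sample mean by the field-wide auxiliary mean."""
    return np.mean(observed_rates) / np.mean(aux_at_sites) * aux_field_mean

obs = np.array([0.2, 0.4, 0.1, 0.8, 0.3, 0.2, 0.5, 0.9, 0.1, 0.4]) / 100  # measured rates
aux = np.array([0.3, 0.5, 0.2, 0.9, 0.4, 0.3, 0.6, 1.0, 0.2, 0.5]) / 100  # model at sites
print(ratio_estimate(obs, aux, aux_field_mean=0.004))  # model mean over whole field
```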

  10. Optimal sensor placement for time-domain identification using a wavelet-based genetic algorithm

    NASA Astrophysics Data System (ADS)

    Mahdavi, Seyed Hossein; Razak, Hashim Abdul

    2016-06-01

    This paper presents a wavelet-based genetic algorithm (GA) strategy for optimal sensor placement (OSP) effective for time-domain structural identification. Initially, the GA-based fitness evaluation is significantly improved by using adaptive wavelet functions. A multi-species decimal GA coding system is then modified to be suitable for an efficient search around the local optima; in this regard, a local mutation operation is introduced in addition to regeneration and reintroduction operators. It is concluded that different characteristics of the applied force influence the features of the structural responses, and therefore the accuracy of time-domain structural identification is directly affected. Thus, a reliable OSP strategy prior to time-domain identification is achieved by methods that minimize the distance between the simulated responses of the entire system and of the condensed system while accounting for the force effects. Numerical and experimental verification of the proposed strategy demonstrates its considerably high computational performance, in terms of both computational cost and identification accuracy. It is deduced that the robustness of the proposed OSP algorithm lies in the precise and fast fitness evaluation at larger sampling rates, which results in the optimum evaluation of the GA-based exploration and exploitation phases towards the global optimum solution.
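
    The sketch below illustrates only the skeleton of such a GA for selecting k sensor locations from n candidates, assuming a user-supplied fitness function (for instance, one based on the distance between full-system and condensed-system simulated responses); the wavelet-based fitness evaluation and multi-species decimal coding of the paper are not reproduced.

```python
# Minimal GA skeleton for sensor placement with mutation and regeneration.
import random

def ga_sensor_placement(n_candidates, k, fitness, pop_size=30, gens=100):
    """Pick k sensor positions out of n_candidates minimizing fitness(layout)."""
    def random_layout():
        return tuple(sorted(random.sample(range(n_candidates), k)))
    pop = [random_layout() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)                        # lower fitness = better
        elite = pop[: pop_size // 2]                 # keep the better half
        children = []
        for parent in elite:
            child = list(parent)
            child[random.randrange(k)] = random.randrange(n_candidates)  # mutate one slot
            child = tuple(sorted(set(child)))
            # regenerate (reintroduce a fresh layout) if mutation created a duplicate
            children.append(child if len(child) == k else random_layout())
        pop = elite + children
    return min(pop, key=fitness)

# usage: best = ga_sensor_placement(n_candidates=120, k=8, fitness=my_fitness_fn)
```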

  11. The Use of Research-Based Instructional Strategies in Introductory Physics: Where do Faculty Leave the Innovation-Decision Process?

    NASA Astrophysics Data System (ADS)

    Henderson, Charles; Dancy, Melissa; Niewiadomska-Bugaj, Magdalena

    2013-03-01

    During the Fall of 2008 a web survey was completed by a representative sample of 722 United States physics faculty. In this talk we will briefly present summary statistics to describe faculty knowledge about and use of 24 specific research-based instructional strategies (RBIS). We will then analyze the results based on a four stage model of the innovation-decision process: knowledge, trial, continuation, and high use. The largest losses occur at the continuation stage, with approximately 1/3 of faculty discontinuing use of all RBIS after trying one or more of these strategies. These results suggest that common dissemination strategies are good at creating knowledge about RBIS and motivation to try a RBIS, but more work is needed to support faculty during implementation and continued use of RBIS. Based on a logistic regression analysis, only nine of the 20 potential predictor variables measured were statistically significant when controlling for other variables. Faculty age, institutional type, and percentage of job related to teaching were not found to be correlated with knowledge or use at any stage. High research productivity and large class sizes were not found to be barriers to use of at least some RBIS. Supported by NSF #0715698.

  12. Unintended Consequences of Evidence-Based Treatment Policy Reform: Is Implementation the Goal or the Strategy for Higher Quality Care?

    PubMed

    Park, Alayna L; Tsai, Katherine H; Guan, Karen; Chorpita, Bruce F

    2018-02-14

    This study examined patterns of evidence-based treatment (EBT) delivery following a county-wide EBT reform initiative. Data were gathered from 60 youth and their 21 providers, who were instructed to deliver therapy as they normally would under the EBT initiative. Results showed limited applicability of county-supported EBTs to this service sample, and that most youth did not receive traditional delivery of EBTs. Findings suggest that it may be unrealistic to expect providers to deliver EBTs with fidelity with all clients, and that EBT implementation may be best thought of as a strategy for improving mental health services rather than a goal.

  13. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of organic persistent pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also commented in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Occupational stressors, burnout and coping strategies between hospital and community psychiatric nurses in a Dublin region.

    PubMed

    McTiernan, K; McDonald, N

    2015-04-01

    Burnout negatively impacts the delivery of mental health services. Psychiatric nurses face stressors that are distinct from those of other nursing specialities. The research was conducted in Ireland and captured a relatively large sample of respondents. The results compared the stressors, coping strategies and burnout levels between hospital and community-based psychiatric nurses. Occupational stress can negatively impact on the well-being of psychiatric nurses, which in turn can lead to poor client care. There is a dearth of published research conducted in Ireland that examines stress within the discipline. A between-groups study, undertaken in February 2011, investigated stressors, burnout and coping strategies between hospital and community-based psychiatric nurses in a Dublin region. Sixty-nine participants (8 males and 61 females), aged between 18 and 60 years, voluntarily completed the Mental Health Professional Stress Scale, the Maslach Burnout Inventory and the PsychNurse Methods of Coping Scale. The findings revealed that nurses were operating in a moderately stressful environment. Stressors focused on organizational issues as opposed to client issues. The main stressors identified were lack of resources, workload and organizational structures/processes. Both groups reported average levels of emotional exhaustion, low levels of depersonalization and average levels of personal accomplishment. A Mann-Whitney U-test and an Independent Samples t-test found significant differences between hospital and community-based nurses regarding depersonalization and personal accomplishment, respectively. Hospital nurses reported higher depersonalization scores, and community nurses had a greater sense of personal accomplishment. The personal accomplishment scores of hospital nurses were below mental health professional norms. No significant differences emerged regarding coping strategies. Avoidant coping strategies were favoured by both groups. It is recommended that interventions aimed at increasing personal accomplishment be implemented for both groups of nurses. An investigation of perceived job control and the use of social support would also be insightful. © 2014 John Wiley & Sons Ltd.

  15. Active learning based segmentation of Crohns disease from abdominal MRI.

    PubMed

    Mahapatra, Dwarikanath; Vos, Franciscus M; Buhmann, Joachim M

    2016-05-01

    This paper proposes a novel active learning (AL) framework and combines it with semi-supervised learning (SSL) for segmenting Crohn's disease (CD) tissues from abdominal magnetic resonance (MR) images. Robust fully supervised learning (FSL) based classifiers require large amounts of labeled data covering different disease severities. Obtaining such data is time-consuming and requires considerable expertise. SSL methods use a few labeled samples and leverage the information from many unlabeled samples to train an accurate classifier. AL queries the labels of the most informative samples and maximizes the gain from the labeling effort. Our primary contribution is in designing a query strategy that combines novel context information with classification uncertainty and feature similarity. Combining SSL and AL gives a robust segmentation method that: (1) optimally uses few labeled samples and many unlabeled samples; and (2) requires lower training time. Experimental results show our method achieves higher segmentation accuracy than FSL methods with fewer samples and reduced training effort. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
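
    A minimal sketch of a query score in this spirit follows, combining prediction entropy (uncertainty) with distance to the nearest already-labeled sample (non-redundancy); the context term and exact weighting of the published method are not reproduced, so the blend below is an assumption for illustration only.

```python
# Query scoring that mixes uncertainty with feature dissimilarity to labeled data.
import numpy as np

def query_scores(proba_unlabeled, feats_unlabeled, feats_labeled, alpha=0.5):
    """Blend predictive entropy with distance to the nearest labeled sample."""
    entropy = -np.sum(proba_unlabeled * np.log(proba_unlabeled + 1e-12), axis=1)
    dists = np.linalg.norm(
        feats_unlabeled[:, None, :] - feats_labeled[None, :, :], axis=2)
    novelty = dists.min(axis=1)               # far from labeled data = novel
    scale = lambda v: (v - v.min()) / (np.ptp(v) + 1e-12)
    return alpha * scale(entropy) + (1 - alpha) * scale(novelty)

# next_query_index = int(np.argmax(query_scores(proba_U, X_U, X_L)))
```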

  16. Over-expression and purification strategies for recombinant multi-protein oligomers: a case study of Mycobacterium tuberculosis σ/anti-σ factor protein complexes.

    PubMed

    Thakur, Krishan Gopal; Jaiswal, Ravi Kumar; Shukla, Jinal K; Praveena, T; Gopal, B

    2010-12-01

    The function of a protein in a cell often involves coordinated interactions with one or several regulatory partners. It is thus imperative to characterize a protein both in isolation and in the context of its complex with an interacting partner. High resolution structural information determined by X-ray crystallography and Nuclear Magnetic Resonance offers the best route to characterize protein complexes. These techniques, however, require highly purified and homogeneous protein samples at high concentration. This requirement often presents a major hurdle for structural studies. Here we present a strategy based on co-expression and co-purification to obtain recombinant multi-protein complexes in the quantity and concentration range that can enable hitherto intractable structural projects. The feasibility of this strategy was examined using the σ factor/anti-σ factor protein complexes from Mycobacterium tuberculosis. The approach was successful across a wide range of σ factors and their cognate interacting partners. It thus appears likely that the analysis of these complexes, based on variations in expression constructs and procedures for the purification and characterization of these recombinant protein samples, would be widely applicable to other multi-protein systems. Copyright © 2010 Elsevier Inc. All rights reserved.

  17. Synergistic Tailoring of Electrostatic and Hydrophobic Interactions for Rapid and Specific Recognition of Lysophosphatidic Acid, an Early-Stage Ovarian Cancer Biomarker.

    PubMed

    Wang, Ying; Pei, Hanwen; Jia, Yan; Liu, Jianhua; Li, Zelun; Ai, Kelong; Lu, Zhongyuan; Lu, Lehui

    2017-08-23

    Early detection of ovarian cancer, the most lethal type of gynecologic cancer, can dramatically improve the efficacy of available treatment strategies. However, few screening tools exist for rapidly and effectively diagnosing ovarian cancer in its early stages. Here, we present a facile "lock-key" strategy, based on rapid, specific detection of plasma lysophosphatidic acid (LPA, an early-stage biomarker) with a probe based on polydiacetylenes (PDAs), for the early diagnosis of ovarian cancer. This strategy relies on specifically inserting the LPA "key" into the PDA "lock" through the synergistic electrostatic and hydrophobic interactions between them, leading to a conformational transition of the PDA backbone with a concomitant blue-to-red color change. The detailed mechanism underlying the high selectivity of PDAs toward LPA is revealed by comprehensive theoretical calculations and experiments. Moreover, the level of LPA can be quantified in plasma samples from both mouse xenograft tumor models and patients with ovarian cancer. Impressively, this approach can be introduced into a portable point-of-care device to successfully distinguish the blood samples of patients with ovarian cancer from those of healthy people, with 100% accuracy. This work provides a valuable portable tool for early diagnosis of ovarian cancer and thus holds great promise for dramatically improving overall survival.

  18. Quality based approach for adaptive face recognition

    NASA Astrophysics Data System (ADS)

    Abboud, Ali J.; Sellahewa, Harin; Jassim, Sabah A.

    2009-05-01

    Recent advances in biometric technology have pushed towards more robust and reliable systems. We aim to build systems that have low recognition errors and are less affected by variation in recording conditions. Recognition errors are often attributed to the usage of low quality biometric samples. Hence, there is a need to develop new intelligent techniques and strategies to automatically measure/quantify the quality of biometric image samples and if necessary restore image quality according to the need of the intended application. In this paper, we present no-reference image quality measures in the spatial domain that have impact on face recognition. The first is called symmetrical adaptive local quality index (SALQI) and the second is called middle halve (MH). Also, an adaptive strategy has been developed to select the best way to restore the image quality, called symmetrical adaptive histogram equalization (SAHE). The main benefits of using quality measures for adaptive strategy are: (1) avoidance of excessive unnecessary enhancement procedures that may cause undesired artifacts, and (2) reduced computational complexity which is essential for real time applications. We test the success of the proposed measures and adaptive approach for a wavelet-based face recognition system that uses the nearest neighborhood classifier. We shall demonstrate noticeable improvements in the performance of adaptive face recognition system over the corresponding non-adaptive scheme.
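
    A hedged sketch of the adaptive principle only: enhancement is applied just when a no-reference quality score falls below a threshold, avoiding unnecessary processing. The contrast-based proxy below merely stands in for SALQI/MH, and plain histogram equalization stands in for SAHE; none of the published measures are reproduced.

```python
# Quality-gated enhancement: restore only low-quality (e.g. low-contrast) images.
import cv2
import numpy as np

def adaptive_enhance(gray_u8, quality_threshold=0.15):
    """Expects an 8-bit grayscale image; returns (image, quality score)."""
    quality = float(gray_u8.std()) / 255.0        # crude global-contrast proxy
    if quality < quality_threshold:
        return cv2.equalizeHist(gray_u8), quality  # enhance only when needed
    return gray_u8, quality                        # leave good images untouched
```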

  19. A generalized least-squares framework for rare-variant analysis in family data.

    PubMed

    Li, Dalin; Rotter, Jerome I; Guo, Xiuqing

    2014-01-01

    Rare variants may, in part, explain some of the heritability missing in current genome-wide association studies. Many gene-based rare-variant analysis approaches proposed in recent years are aimed at population-based samples, although analysis strategies for family-based samples are clearly warranted since the family-based design has the potential to enhance our ability to enrich for rare causal variants. We have recently developed the generalized least squares, sequence kernel association test, or GLS-SKAT, approach for rare-variant analyses in family samples, in which the kinship matrix computed from the high-dimensional genetic data was used to decorrelate the family structure. We then applied the SKAT-O approach for gene-/region-based inference in the decorrelated data. In this study, we applied this GLS-SKAT method to the systolic blood pressure data in the simulated family sample distributed by the Genetic Analysis Workshop 18. We compared the GLS-SKAT approach to the rare-variant analysis approach implemented in family-based association test-v1 and demonstrated that the GLS-SKAT approach provides superior power and good control of the type I error rate.
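
    The core decorrelation step can be sketched as follows: a phenotypic covariance is assembled from the kinship matrix, and the phenotype and design matrices are whitened by its Cholesky factor before a standard region-based test. The variance components below are fixed illustratively rather than estimated, unlike in the actual GLS-SKAT procedure.

```python
# Whitening family-structured data with a kinship-based covariance.
import numpy as np

def decorrelate(y, X, kinship, h2=0.4):
    """Whiten phenotype y and design matrix X given a kinship matrix."""
    n = len(y)
    sigma = 2.0 * h2 * kinship + (1.0 - h2) * np.eye(n)  # additive + residual parts
    L = np.linalg.cholesky(sigma)
    y_star = np.linalg.solve(L, y)     # equivalent to L^{-1} y
    X_star = np.linalg.solve(L, X)     # equivalent to L^{-1} X
    return y_star, X_star              # now usable with unrelated-sample tests
```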

  20. Influences of sampling size and pattern on the uncertainty of correlation estimation between soil water content and its influencing factors

    NASA Astrophysics Data System (ADS)

    Lai, Xiaoming; Zhu, Qing; Zhou, Zhiwen; Liao, Kaihua

    2017-12-01

    In this study, seven random combination sampling strategies were applied to investigate the uncertainties in estimating the hillslope mean soil water content (SWC) and the correlation coefficients between SWC and soil/terrain properties on a tea + bamboo hillslope. One of the sampling strategies is global random sampling, and the other six are stratified random sampling on the top, middle, toe, top + mid, top + toe and mid + toe slope positions. When each sampling strategy was applied, sample sizes were gradually reduced, and each sample size contained 3000 replicates. Under each sample size of each sampling strategy, the relative errors (REs) and coefficients of variation (CVs) of the estimated hillslope mean SWC and of the correlation coefficients between SWC and soil/terrain properties were calculated to quantify the accuracy and uncertainty. The results showed that the uncertainty of the estimates decreased as the sample size increased. However, larger sample sizes were required to reduce the uncertainty in correlation coefficient estimation than in hillslope mean SWC estimation. Under global random sampling, 12 randomly sampled sites on this hillslope were adequate to estimate the hillslope mean SWC with RE and CV ≤10%. However, at least 72 randomly sampled sites were needed to ensure the estimated correlation coefficients had REs and CVs ≤10%. Among all sampling strategies, reducing sampling sites on the middle slope had the least influence on the estimation of the hillslope mean SWC and correlation coefficients. Under this strategy, 60 sites (10 on the middle slope and 50 on the top and toe slopes) were enough to ensure the estimated correlation coefficients had REs and CVs ≤10%. This suggests that when designing SWC sampling, the proportion of sites on the middle slope can be reduced to 16.7% of the total number of sites. Findings of this study will be useful for optimal SWC sampling design.
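
    A minimal sketch of the subsampling experiment, assuming vectors of SWC and one influencing factor over all monitored sites: random subsets of a given size are drawn repeatedly, and the relative error and coefficient of variation of the mean SWC and of the correlation coefficient are summarized. The replicate count matches the abstract; everything else is illustrative.

```python
# Monte Carlo subsampling to quantify RE and CV of mean SWC and a correlation.
import numpy as np

def subsample_uncertainty(swc, factor, n_sites, n_rep=3000, seed=0):
    rng = np.random.default_rng(seed)
    true_mean = swc.mean()
    true_r = np.corrcoef(swc, factor)[0, 1]
    means, rs = [], []
    for _ in range(n_rep):
        idx = rng.choice(len(swc), size=n_sites, replace=False)
        means.append(swc[idx].mean())
        rs.append(np.corrcoef(swc[idx], factor[idx])[0, 1])
    means, rs = np.array(means), np.array(rs)
    re_mean = np.mean(np.abs(means - true_mean)) / true_mean * 100   # RE in %
    cv_mean = means.std() / means.mean() * 100                       # CV in %
    re_r = np.mean(np.abs(rs - true_r)) / abs(true_r) * 100
    cv_r = np.std(rs) / abs(np.mean(rs)) * 100
    return re_mean, cv_mean, re_r, cv_r
```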

  1. Influence of survey strategy and interpolation model on DEM quality

    NASA Astrophysics Data System (ADS)

    Heritage, George L.; Milan, David J.; Large, Andrew R. G.; Fuller, Ian C.

    2009-11-01

    Accurate characterisation of morphology is critical to many studies in the field of geomorphology, particularly those dealing with changes over time. Digital elevation models (DEMs) are commonly used to represent morphology in three dimensions. The quality of the DEM is largely a function of the accuracy of individual survey points, field survey strategy, and the method of interpolation. Recommendations concerning field survey strategy and appropriate methods of interpolation are currently lacking. Furthermore, the majority of studies to date consider error to be uniform across a surface. This study quantifies survey strategy and interpolation error for a gravel bar on the River Nent, Blagill, Cumbria, UK. Five sampling strategies were compared: (i) cross section; (ii) bar outline only; (iii) bar and chute outline; (iv) bar and chute outline with spot heights; and (v) aerial LiDAR equivalent, derived from degraded terrestrial laser scan (TLS) data. Digital Elevation Models were then produced using five different common interpolation algorithms. Each resultant DEM was differentiated from a terrestrial laser scan of the gravel bar surface in order to define the spatial distribution of vertical and volumetric error. Overall, triangulation with linear interpolation (TIN) or point kriging appeared to provide the best interpolators for the bar surface. The lowest error on average was found for the simulated aerial LiDAR survey strategy, regardless of interpolation technique. However, comparably low errors were also found for the bar-chute-spot sampling strategy when TINs or point kriging was used as the interpolator. The magnitude of the errors between survey strategies exceeded that found between interpolation techniques for a specific survey strategy. Strong relationships between local surface topographic variation (defined as the standard deviation of vertical elevations in a 0.2-m diameter moving window) and DEM errors were also found, with much greater errors at slope breaks such as bank edges. A series of curves is presented to demonstrate these relationships for each interpolation and survey strategy. The simulated aerial LiDAR data set displayed the lowest errors across the flatter surfaces; however, sharp slope breaks are better modelled by the morphologically based survey strategy. The curves presented have general application to spatially distributed data of river beds and may be applied to standard deviation grids to predict spatial error within a surface, depending upon sampling strategy and interpolation algorithm.
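
    A hedged sketch of the evaluation loop: survey points are interpolated onto the reference locations with different algorithms and differenced against a dense reference surface standing in for the TLS scan. scipy's 'linear' griddata option is a Delaunay/TIN-style interpolator; kriging would require an additional package and is omitted. The variable names in the commented usage line are placeholders.

```python
# Compare interpolation algorithms by differencing against a reference surface.
import numpy as np
from scipy.interpolate import griddata

def dem_error(survey_xy, survey_z, ref_xy, ref_z, method="linear"):
    """Interpolate survey points to reference locations and summarize the error."""
    dem = griddata(survey_xy, survey_z, ref_xy, method=method)
    err = dem - ref_z                        # NaN outside the convex hull
    return np.nanmean(np.abs(err)), np.nanstd(err)

# for method in ("nearest", "linear", "cubic"):
#     print(method, dem_error(bar_xy, bar_z, tls_xy, tls_z, method))
```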

  2. Antibodies as means for selective mass spectrometry.

    PubMed

    Boström, Tove; Takanen, Jenny Ottosson; Hober, Sophia

    2016-05-15

    For protein analysis of biological samples, two major strategies are used today: mass spectrometry (MS) and antibody-based methods. Each strategy offers advantages and drawbacks. However, combining the two, using an immunoenrichment step followed by MS analysis, brings together the benefits of each method, resulting in increased sensitivity, faster analysis and the possibility of higher degrees of multiplexing. The immunoenrichment can be performed either at the protein or the peptide level, and quantification standards can be added in order to enable determination of the absolute protein concentration in the sample. The combination of immunoenrichment and MS holds great promise for the future in both proteomics and clinical diagnostics. This review describes different setups of immunoenrichment coupled to mass spectrometry and how these can be utilized in various applications. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Quantitation of the phosphoproteome using the library-assisted extracted ion chromatogram (LAXIC) strategy.

    PubMed

    Arrington, Justine V; Xue, Liang; Tao, W Andy

    2014-01-01

    Phosphorylation is a key posttranslational modification that regulates many signaling pathways, but quantifying changes in phosphorylation between samples can be challenging due to its low stoichiometry within cells. We have introduced a mass spectrometry-based label-free quantitation strategy termed LAXIC for the analysis of the phosphoproteome. This method uses a spiked-in synthetic peptide library designed to elute across the entire chromatogram for local normalization of phosphopeptides within complex samples. Normalization of phosphopeptides by library peptides that co-elute within a small time frame accounts for fluctuating ion suppression effects, allowing more accurate quantitation even when LC-MS performance varies. Here we explain the premise of LAXIC, the design of a suitable peptide library, and how the LAXIC algorithm can be implemented with software developed in-house.
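
    In the spirit of the local normalization described, the sketch below divides each phosphopeptide intensity by the median intensity of spiked library peptides eluting within a retention-time window around it; the window width and data layout are illustrative assumptions, not the published LAXIC parameters.

```python
# Local normalization of phosphopeptide intensities by co-eluting library peptides.
import numpy as np

def local_normalize(phos_rt, phos_int, lib_rt, lib_int, window_min=2.0):
    lib_rt = np.asarray(lib_rt, dtype=float)
    lib_int = np.asarray(lib_int, dtype=float)
    normalized = []
    for rt, inten in zip(phos_rt, phos_int):
        nearby = np.abs(lib_rt - rt) <= window_min           # co-eluting library peptides
        local_ref = np.median(lib_int[nearby]) if nearby.any() else np.median(lib_int)
        normalized.append(inten / local_ref)                 # local normalization factor
    return np.array(normalized)
```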

  4. Target-induced formation of gold amalgamation on DNA-based sensing platform for electrochemical monitoring of mercury ion coupling with cycling signal amplification strategy.

    PubMed

    Chen, Jinfeng; Tang, Juan; Zhou, Jun; Zhang, Lan; Chen, Guonan; Tang, Dianping

    2014-01-31

    Heavy metal ion pollution poses severe risks to human health and the environment because of the likelihood of bioaccumulation and toxicity. Driven by the requirement to monitor trace-level mercury ion (Hg(2+)), herein we construct a new DNA-based sensor for sensitive electrochemical monitoring of Hg(2+) by coupling target-induced formation of gold amalgamation on a DNA-based sensing platform with a gold amalgamation-catalyzed cycling signal amplification strategy. The sensor was simply prepared by covalent conjugation of aminated poly-T(25) oligonucleotide onto a glassy carbon electrode by typical carbodiimide coupling. Upon introduction of the target analyte, the Hg(2+) ion was intercalated into the DNA polyion complex membrane based on T-Hg(2+)-T coordination chemistry. The chelated Hg(2+) ion could induce the formation of gold amalgamation, which could catalyze the reduction of p-nitrophenol with the aid of NaBH4 and Ru(NH3)6(3+) for cycling signal amplification. Experimental results indicated that the electronic signal of the system increased with increasing Hg(2+) levels in the sample, with a detection limit of 0.02 nM and a dynamic range of up to 1000 nM Hg(2+). The strategy afforded exquisite selectivity for Hg(2+) against other environmentally relevant metal ions. In addition, the methodology was evaluated for the analysis of Hg(2+) in spiked tap-water samples, and the recovery was 87.9-113.8%. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. A typology for campus-based alcohol prevention: moving toward environmental management strategies.

    PubMed

    DeJong, William; Langford, Linda M

    2002-03-01

    This article outlines a typology of programs and policies for preventing and treating campus-based alcohol-related problems, reviews recent case studies showing the promise of campus-based environmental management strategies and reports findings from a national survey of U.S. colleges and universities about available resources for pursuing environmentally focused prevention. The typology is grounded in a social ecological framework, which recognizes that health-related behaviors are affected through multiple levels of influence: intrapersonal (individual) factors, interpersonal (group) processes, institutional factors, community factors and public policy. The survey on prevention resources and activities was mailed to senior administrators responsible for their school's institutional response to substance use problems. The study sample was an equal probability sample of 365 2- and 4-year U.S. campuses. The response rate was 76.9%. Recent case studies suggest the value of environmentally focused alcohol prevention approaches on campus, but more rigorous research is needed to establish their effectiveness. The administrators' survey showed that most U.S. colleges have not yet installed the basic infrastructure required for developing, implementing and evaluating environmental management strategies. The typology of campus-based prevention options can be used to categorize current efforts and to inform strategic planning of multilevel interventions. Additional colleges and universities should establish a permanent campus task force that reports directly to the president, participate actively in a campus-community coalition that seeks to change the availability of alcohol in the local community and join a state-level association that speaks out on state and federal policy issues.

  6. An Asymptotically-Optimal Sampling-Based Algorithm for Bi-directional Motion Planning

    PubMed Central

    Starek, Joseph A.; Gomez, Javier V.; Schmerling, Edward; Janson, Lucas; Moreno, Luis; Pavone, Marco

    2015-01-01

    Bi-directional search is a widely used strategy to increase the success and convergence rates of sampling-based motion planning algorithms. Yet, few results are available that merge both bi-directional search and asymptotic optimality into existing optimal planners, such as PRM*, RRT*, and FMT*. The objective of this paper is to fill this gap. Specifically, this paper presents a bi-directional, sampling-based, asymptotically-optimal algorithm named Bi-directional FMT* (BFMT*) that extends the Fast Marching Tree (FMT*) algorithm to bidirectional search while preserving its key properties, chiefly lazy search and asymptotic optimality through convergence in probability. BFMT* performs a two-source, lazy dynamic programming recursion over a set of randomly-drawn samples, correspondingly generating two search trees: one in cost-to-come space from the initial configuration and another in cost-to-go space from the goal configuration. Numerical experiments illustrate the advantages of BFMT* over its unidirectional counterpart, as well as a number of other state-of-the-art planners. PMID:27004130

  7. Natural sampling strategy

    NASA Technical Reports Server (NTRS)

    Hallum, C. R.; Basu, J. P. (Principal Investigator)

    1979-01-01

    A natural stratum-based sampling scheme and the aggregation procedures for estimating wheat area, yield, and production and their associated prediction error estimates are described. The methodology utilizes LANDSAT imagery and agrophysical data to permit an improved stratification in foreign areas by ignoring political boundaries and restratifying along boundaries that are more homogeneous with respect to the distribution of agricultural density, soil characteristics, and average climatic conditions. A summary of test results is given including a discussion of the various problems encountered.

  8. Effectiveness of Hydraulic Parameterization Strategies for Simulating Moisture Dynamics in a Deep Semi-Arid Vadose Zone

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Schaap, M. G.

    2012-12-01

    Over the past fifteen years, the University of Arizona has carried out four controlled infiltration experiments in a 3600 m2, 15-meter-deep vadose zone (Maricopa, Arizona) in which the evolution of moisture content (9 wells, 25 cm resolution) and matric potential (27 locations) was monitored and the subsurface stratigraphy, texture (1042 samples), and bulk density (251 samples) were characterized. In order to simulate the subsurface moisture dynamics, it is necessary to define the 3D structure of the subsurface hydraulic characteristics (i.e. moisture retention and hydraulic functions). Several simple to complex strategies are possible, ranging from stratigraphy-based layering using hydraulic parameters derived from core samples to sophisticated numerical inversions based on 3D geostatistics and site-specific pedotransfer functions. A range of approaches will be evaluated on objective metrics that quantify how well the observed moisture dynamics are matched by simulations. We will evaluate the worth of auxiliary data such as observed matric potentials and quantify the number of texture samples needed to arrive at effective descriptions of subsurface structure. In addition, we will discuss more subjective metrics that evaluate the relative effort involved and estimate the monetary cost of each method. While some of the results will only be valid for the studied site, some general conclusions will be possible about the effectiveness of particular methods for other semi-arid sites.

  9. ELIMINATION OF THE CHARACTERIZATION OF DWPF POUR STREAM SAMPLE AND THE GLASS FABRICATION AND TESTING OF THE DWPF SLUDGE BATCH QUALIFICATION SAMPLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amoroso, J.; Peeler, D.; Edwards, T.

    2012-05-11

    A recommendation to eliminate all characterization of pour stream glass samples and the glass fabrication and Product Consistency Test (PCT) of the sludge batch qualification sample was made by a Six-Sigma team chartered to eliminate non-value-added activities for the Defense Waste Processing Facility (DWPF) sludge batch qualification program and is documented in the report SS-PIP-2006-00030. That recommendation was supported through a technical data review by the Savannah River National Laboratory (SRNL) and is documented in the memorandums SRNL-PSE-2007-00079 and SRNL-PSE-2007-00080. At the time of writing those memorandums, the DWPF was processing sludge-only waste but has since transitioned to a coupled operation (sludge and salt). The SRNL was recently tasked to perform a similar data review relevant to coupled operations and re-evaluate the previous recommendations. This report evaluates the validity of eliminating the characterization of pour stream glass samples and the glass fabrication and Product Consistency Test (PCT) of the sludge batch qualification samples based on sludge-only and coupled operations. The pour stream sample has confirmed the DWPF's ability to produce an acceptable waste form from Slurry Mix Evaporator (SME) blending and product composition/durability predictions for the previous sixteen years, but ultimately the pour stream analysis has added minimal value to the DWPF's waste qualification strategy. Similarly, the information gained from the glass fabrication and PCT of the sludge batch qualification sample was determined to add minimal value to the waste qualification strategy since that sample is routinely not representative of the waste composition ultimately processed at the DWPF due to blending and salt processing considerations. Moreover, the qualification process has repeatedly confirmed minimal differences in glass behavior from actual radioactive waste to glasses fabricated from simulants or batch chemicals. In contrast, the variability study has significantly added value to the DWPF's qualification strategy. The variability study has evolved to become the primary aspect of the DWPF's compliance strategy as it has been shown to be versatile and capable of adapting to the DWPF's various and diverse waste streams and blending strategies. The variability study, which aims to ensure that durability requirements and the PCT and chemical composition correlations are valid for the compositional region to be processed at the DWPF, must continue to be performed. Due to the importance of the variability study and its place in the DWPF's qualification strategy, it will also be discussed in this report. An analysis of historical data and Production Records indicated that the recommendation of the Six Sigma team to eliminate all characterization of pour stream glass samples and the glass fabrication and PCT performed with the qualification glass does not compromise the DWPF's current compliance plan. Furthermore, the DWPF should continue to produce an acceptable waste form following the remaining elements of the Glass Product Control Program, regardless of a sludge-only or coupled operations strategy. If the DWPF does decide to eliminate the characterization of pour stream samples, pour stream samples should continue to be collected for archival reasons, which would allow testing to be performed should any issues arise or new repository test methods be developed.

  10. Value of recruitment strategies used in a primary care practice-based trial.

    PubMed

    Ellis, Shellie D; Bertoni, Alain G; Bonds, Denise E; Clinch, C Randall; Balasubramanyam, Aarthi; Blackwell, Caroline; Chen, Haiying; Lischke, Michael; Goff, David C

    2007-05-01

    "Physicians-recruiting-physicians" is the preferred recruitment approach for practice-based research. However, yields are variable; and the approach can be costly and lead to biased, unrepresentative samples. We sought to explore the potential efficiency of alternative methods. We conducted a retrospective analysis of the yield and cost of 10 recruitment strategies used to recruit primary care practices to a randomized trial to improve cardiovascular disease risk factor management. We measured response and recruitment yields and the resources used to estimate the value of each strategy. Providers at recruited practices were surveyed about motivation for participation. Response to 6 opt-in marketing strategies was 0.40% (53/13290), ranging from 0% to 2.86% by strategy; 33.96% (18/53) of responders were recruited to the study. Of those recruited from opt-out strategies, 8.68% joined the study, ranging from 5.35% to 41.67% per strategy. A strategy that combined both opt-in and opt-out approaches resulted in a 51.14% (90/176) response and a 10.80% (19/90) recruitment rate. Cost of recruitment was $613 per recruited practice. Recruitment approaches based on in-person meetings (41.67%), previous relationships (33.33%), and borrowing an Area Health Education Center's established networks (10.80%), yielded the most recruited practices per effort and were most cost efficient. Individual providers who chose to participate were motivated by interest in improving their clinical practice (80.5%); contributing to CVD primary prevention (54.4%); and invigorating their practice with new ideas (42.1%). This analysis provides suggestions for future recruitment efforts and research. Translational studies with limited funds could consider multi-modal recruitment approaches including in-person presentations to practice groups and exploitation of previous relationships, which require the providers to opt-out, and interactive opt-in approaches which rely on borrowed networks. These approaches can be supplemented with non-relationship-based opt-out strategies such as cold calls strategically targeted to underrepresented provider groups.

  11. Analysis of mathematical problem-solving ability based on metacognition on problem-based learning

    NASA Astrophysics Data System (ADS)

    Mulyono; Hadiyanti, R.

    2018-03-01

    Problem-solving is a primary purpose of the mathematics curriculum. Problem-solving ability is influenced by beliefs and metacognition. Metacognition, as a superordinate capability, can direct and regulate cognition and motivation and, in turn, problem-solving processes. This study aims to (1) test and analyze the quality of problem-based learning and (2) investigate problem-solving ability based on metacognition level. The research uses a mixed-methods design. The subjects were class XI Mathematics and Science students at Kesatrian 2 High School, Semarang, who were classified into tacit use, aware use, strategic use, and reflective use levels of metacognition. Data were collected using a scale, interviews, and tests, and were analyzed with a proportion test, t-test, and paired-samples t-test. The results show that students at the tacit use level were able to complete the problems given but did not understand what strategy they used or why. Students at the aware use level were able to solve the problem and to build new knowledge through problem-solving up to the indicators of understanding the problem and determining the strategy used, although not always correctly. Students at the strategic use level could apply and adapt a wide variety of appropriate strategies to solve the problems and reached the indicator of re-examining the process and outcome. No student at the reflective use level was found in this study. Based on the results, further study is suggested to identify metacognition in problem-solving so that the characteristics of each metacognition level become clearer in a larger sample. Teachers also need in-depth knowledge of students' metacognitive activity and its relationship to mathematical problem-solving and other problem resolution.

  12. Prevalence of urinary tract infection in acutely unwell children in general practice: a prospective study with systematic urine sampling.

    PubMed

    O'Brien, Kathryn; Edwards, Adrian; Hood, Kerenza; Butler, Christopher C

    2013-02-01

    Urinary tract infection (UTI) in children may be associated with long-term complications that could be prevented by prompt treatment. To determine the prevalence of UTI in acutely ill children aged ≤5 years presenting in general practice and to explore patterns of presenting symptoms and urine sampling strategies. Prospective observational study with systematic urine sampling, in general practices in Wales, UK. In total, 1003 children were recruited from 13 general practices between March 2008 and July 2010. The prevalence of UTI was determined and multivariable analysis performed to determine the probability of UTI. Among the 597 children (60.0%) who provided urine samples within 2 days, the prevalence of UTI was 5.9% (95% confidence interval [CI] = 4.3% to 8.0%) overall: 7.3% in those aged <3 years and 3.2% in 3- to 5-year-olds. Neither a history of fever nor the absence of an alternative source of infection was associated with UTI (P = 0.64; P = 0.69, respectively). The probability of UTI in children aged ≥3 years without increased urinary frequency or dysuria was 2%. The probability of UTI was ≥5% in all other groups. Urine sampling based purely on GP suspicion would have missed 80% of UTIs, while a sampling strategy based on current guidelines would have missed 50%. Approximately 6% of acutely unwell children presenting to UK general practice met the criteria for a laboratory diagnosis of UTI. This higher-than-previously-recognised prior probability of UTI warrants raised awareness of the condition and suggests clinicians should lower their threshold for urine sampling in young children. The absence of fever or presence of an alternative source of infection, as emphasised in current guidelines, may not rule out UTI in young children with adequate certainty.
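    As a rough illustration of how a prevalence estimate with a confidence interval like the one above can be obtained, the sketch below applies the Wilson score interval to counts reconstructed from the reported figures (roughly 35 UTIs among 597 sampled children). The exact counts and the interval method used by the authors are assumptions, so the result only approximately matches the published 4.3% to 8.0%.

```python
from math import sqrt

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% by default)."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Roughly 35 of 597 sampled children met the laboratory criteria (5.9%).
lo, hi = wilson_ci(35, 597)
print(f"prevalence ~ {35/597:.1%}, 95% CI {lo:.1%} to {hi:.1%}")
```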

  13. Optimizing 4-Dimensional Magnetic Resonance Imaging Data Sampling for Respiratory Motion Analysis of Pancreatic Tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stemkens, Bjorn, E-mail: b.stemkens@umcutrecht.nl; Tijssen, Rob H.N.; Senneville, Baudouin D. de

    2015-03-01

    Purpose: To determine the optimum sampling strategy for retrospective reconstruction of 4-dimensional (4D) MR data for nonrigid motion characterization of tumor and organs at risk for radiation therapy purposes. Methods and Materials: For optimization, we compared 2 surrogate signals (external respiratory bellows and internal MRI navigators) and 2 MR sampling strategies (Cartesian and radial) in terms of image quality and robustness. Using the optimized protocol, 6 pancreatic cancer patients were scanned to calculate the 4D motion. Region of interest analysis was performed to characterize the respiratory-induced motion of the tumor and organs at risk simultaneously. Results: The MRI navigator was found to be a more reliable surrogate for pancreatic motion than the respiratory bellows signal. Radial sampling was least affected by undersampling artifacts and intraview motion. Motion characterization revealed interorgan and interpatient variation, as well as heterogeneity within the tumor. Conclusions: A robust 4D-MRI method, based on clinically available protocols, is presented and successfully applied to characterize the abdominal motion in a small number of pancreatic cancer patients.
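    The abstract describes retrospectively reconstructing 4D-MRI from radially sampled data guided by a respiratory surrogate. The sketch below is a minimal, hypothetical illustration of one such step: sorting acquired k-space spokes into respiratory amplitude bins from a navigator signal. The function and variable names are illustrative assumptions, not the authors' reconstruction pipeline.

```python
import numpy as np

def bin_spokes_by_phase(navigator, n_bins=10):
    """Assign each acquired k-space spoke to a respiratory amplitude bin.

    navigator : 1D array with one surrogate value (e.g. MRI navigator
                amplitude) per spoke, in acquisition order.
    Returns a list of index arrays, one per bin, that a retrospective
    reconstruction could use to build each respiratory phase.
    """
    internal_edges = np.quantile(navigator, np.linspace(0, 1, n_bins + 1)[1:-1])
    labels = np.digitize(navigator, internal_edges)
    return [np.where(labels == b)[0] for b in range(n_bins)]

# Synthetic navigator trace for 2000 radial spokes (hypothetical numbers).
rng = np.random.default_rng(0)
nav = np.sin(np.linspace(0, 60 * np.pi, 2000)) + 0.1 * rng.standard_normal(2000)
bins = bin_spokes_by_phase(nav, n_bins=10)
print([len(b) for b in bins])   # roughly equal spoke counts per amplitude bin
```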

  14. Reduced prevalence and severity of wounds following implementation of the Champions for Skin Integrity model to facilitate uptake of evidence-based practice in aged care.

    PubMed

    Edwards, Helen E; Chang, Anne M; Gibb, Michelle; Finlayson, Kathleen J; Parker, Christina; O'Reilly, Maria; McDowell, Jan; Shuter, Patricia

    2017-12-01

    To evaluate the implementation of the Champions for Skin Integrity model on facilitating uptake of evidence-based wound management and improving skin integrity in residents of aged care facilities. The incidence of skin tears, pressure injuries and leg ulcers increases with age, and such wounds can be a serious issue in aged care facilities. Older adults are not only at higher risk for wounds related to chronic disease but also for injuries related to falls and manual handling requirements. A longitudinal, pre-post design. The Champions for Skin Integrity model was developed using evidence-based strategies for transfer of evidence into practice. Data were collected before and six months after implementation of the model. Data on wound management and skin integrity were obtained from two random samples of residents (n = 200 pre; n = 201 post) from seven aged care facilities. A staff survey was also undertaken (n = 126 pre; n = 143 post) of experience, knowledge and evidence-based wound management. Descriptive statistics were calculated for all variables. Where relevant, chi-square tests for independence or t-tests were used to identify differences between the pre- and post-data. There was a significant decrease in the number of residents with a wound of any type (54% pre vs 43% post, χ² = 4.2, p = 0.041), as well as a significant reduction in specific wound types, for example pressure injuries (24% pre vs 10% post, χ² = 14.1, p < 0.001), following implementation of the model. An increase in implementation of evidence-based wound management and prevention strategies was observed in the postimplementation sample in comparison with the preimplementation sample. This included use of limb protectors and/or protective clothing (6% pre vs 20% post, χ² = 17.0, p < 0.001) and use of an emollient or soap alternative for bathing residents (50% pre vs 74% post, χ² = 13.9, p = 0.001). Implementation of the model in this sample fostered an increase in implementation of evidence-based wound management and prevention strategies, which was associated with a decrease in the prevalence and severity of wounds. This study suggests the Champions for Skin Integrity model has the potential to improve uptake of evidence-based wound management and improve skin integrity for older adults. © 2017 John Wiley & Sons Ltd.
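    A minimal sketch of the pre/post comparison reported above, using a chi-square test of independence on counts reconstructed from the published percentages (54% of 200 residents pre vs 43% of 201 post, i.e. roughly 108 vs 86 residents with a wound). Because these counts are rounded reconstructions, the statistic will not exactly reproduce the published χ² = 4.2.

```python
from scipy.stats import chi2_contingency

# Residents with any wound, counts reconstructed from reported percentages.
table = [[108, 200 - 108],   # pre:  wound / no wound
         [86,  201 - 86]]    # post: wound / no wound

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.1f}, p = {p:.3f}")
```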

  15. Analysis of biofluids in aqueous environment based on mid-infrared spectroscopy.

    PubMed

    Fabian, Heinz; Lasch, Peter; Naumann, Dieter

    2005-01-01

    In this study we describe a semiautomatic Fourier transform infrared spectroscopic methodology for the analysis of liquid serum samples, which combines simple sample introduction with high sample throughput. The applicability of this new infrared technology to the analysis of liquid serum samples from a cohort of cattle naturally infected with bovine spongiform encephalopathy and from controls was explored in comparison to the conventional approach based on transmission infrared spectroscopy of dried serum films. Artificial neural network analysis of the infrared data was performed to differentiate between bovine spongiform encephalopathy-negative controls and animals in the late stage of the disease. After training of artificial neural network classifiers, infrared spectra of sera from an independent external validation data set were analyzed. In this way, sensitivities between 90 and 96% and specificities between 84 and 92% were achieved, depending upon the strategy of data collection and data analysis. Based on these results, the advantages and limitations of the liquid sample technique and the dried film approach for routine analysis of biofluids are discussed. 2005 Society of Photo-Optical Instrumentation Engineers.

  16. Competition and rural primary care programs.

    PubMed

    Ricketts, T C

    1990-04-01

    Rural primary care programs were established in areas where there was thought to be no competition for patients. However, evidence from site visits and surveys of a national sample of subsidized programs revealed a pattern of competitive responses by the clinics. In this study of 193 rural primary care programs, mail and telephone surveys produced uniform data on the organization, operation, finances, and utilization of a representative sample of clinics. The programs were found to compete in terms of: (1) price, (2) service mix, (3) staff availability, (4) structural accessibility, (5) outreach, and (6) targeting a segment of the market. The competitive strategies employed by the clinics had consequences that affected their productivity and financial stability. The strategies were related to the perceived missions of the programs, and depended heavily upon the degree of isolation of the program and the targeting of the services. The competitive strategy chosen by a particular program could not be predicted based on service area population and apparent competitors in the service area. The goals and objectives of the programs had more to do with their competitive responses than with market characteristics. Moreover, the chosen strategies may not meet the demands of those markets.

  17. The Effectiveness of Prompts to Promote Engagement With Digital Interventions: A Systematic Review.

    PubMed

    Alkhaldi, Ghadah; Hamilton, Fiona L; Lau, Rosa; Webster, Rosie; Michie, Susan; Murray, Elizabeth

    2016-01-08

    Digital interventions have been effective in improving numerous health outcomes and health behaviors; furthermore, they are increasingly being used in different health care areas, including self-management of long-term conditions, mental health, and health promotion. The full potential of digital interventions is hindered by a lack of user engagement. There is an urgent need to develop effective strategies that can promote users' engagement with digital interventions. One potential method is the use of technology-based reminders or prompts. To evaluate the effectiveness of technology-based strategies for promoting engagement with digital interventions. Cochrane Collaboration guidelines on systematic review methodology were followed. The search strategy was executed across 7 electronic databases: the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, Embase, Web of Science, the Education Resources Information Center (ERIC), PsycINFO, and the Cumulative Index to Nursing and Allied Health Literature (CINAHL). Databases were searched from inception to September 13, 2013, with no language or publication type restrictions, using three concepts: randomized controlled trials, digital interventions, and engagement. Gray literature and reference lists of included studies were also searched. Titles and abstracts were independently screened by 2 authors, then the full texts of potentially eligible papers were obtained and double-screened. Data from eligible papers were extracted by one author and checked for accuracy by another author. Bias was assessed using the Cochrane risk of bias assessment tool. Narrative synthesis was performed on all included studies and, where appropriate, data were pooled using meta-analysis. All findings were reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. A total of 14 studies were included in the review with 8774 participants. Of the 14 studies, 9 had sufficient data to be included in the meta-analyses. The meta-analyses suggested that technology-based strategies can potentially promote engagement compared to no strategy for dichotomous outcomes (relative risk [RR] 1.27, 95% CI 1.01-1.60, I² = 71%), but due to considerable heterogeneity and the small sample sizes in most studies, this result should be treated with caution. No studies reported adverse or economic outcomes. Only one study with a small sample size compared different characteristics; the study found that strategies promoting new digital intervention content and those sent to users shortly after they started using the digital intervention were more likely to engage users. Overall, studies reported borderline positive effects of technology-based strategies on engagement compared to no strategy. However, the results have to be interpreted with caution. More research is needed to replicate findings and understand which characteristics of the strategies are effective in promoting engagement and how cost-effective they are.
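    To make the pooled relative-risk figure above concrete, the sketch below shows a standard DerSimonian-Laird random-effects pooling of log relative risks. The study-level counts are hypothetical stand-ins rather than data from the review; the sketch only illustrates how a pooled RR with a 95% CI can be computed while allowing for between-study heterogeneity.

```python
import numpy as np

def pooled_relative_risk(events_t, n_t, events_c, n_c):
    """DerSimonian-Laird random-effects pooling of log relative risks.

    All arguments are arrays with one entry per study (hypothetical here).
    Returns the pooled RR with its 95% CI.
    """
    events_t, n_t = np.asarray(events_t, float), np.asarray(n_t, float)
    events_c, n_c = np.asarray(events_c, float), np.asarray(n_c, float)

    log_rr = np.log((events_t / n_t) / (events_c / n_c))
    var = 1/events_t - 1/n_t + 1/events_c - 1/n_c   # large-sample variance of log RR

    w = 1 / var                                      # fixed-effect weights
    q = np.sum(w * (log_rr - np.sum(w * log_rr) / np.sum(w)) ** 2)
    df = len(log_rr) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1 / (var + tau2)                          # random-effects weights
    mu = np.sum(w_re * log_rr) / np.sum(w_re)
    se = np.sqrt(1 / np.sum(w_re))
    return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

# Hypothetical engagement counts for three trials (prompt arm vs control arm).
rr, lo, hi = pooled_relative_risk([60, 45, 80], [100, 90, 150],
                                  [45, 40, 70], [100, 90, 150])
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```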

  18. Predictors of Coping Strategies Employed by Iraqi Refugees in Jordan.

    PubMed

    Al-Smadi, Ahmed Mohammad; Tawalbeh, Loai Issa; Gammoh, Omar Salem; Ashour, Ala; Alzoubi, Fatmeh Ahmad; Slater, Paul

    2017-10-01

    The purpose of this study was to examine coping strategies used by Iraqi refugees in Jordan based on their demographic details. A cross-sectional design was used. A representative sample of 333 refugees living in Jordan participated in the study. The COPE Inventory and demographic details were used to collect and collate the relevant data. Being older, female, educated, single, and living with more than three family members was associated with greater use of the problem-solving coping strategy. Being female, educated, and unemployed was associated with greater use of the active emotional coping strategy. In addition, being older, male, illiterate, unemployed, and living with fewer than three family members was associated with greater use of the avoidant emotional coping strategy. This study recommends a multidisciplinary intervention approach as the best method of addressing and fulfilling the health and socioeconomic needs of older, male, illiterate, unemployed people.

  19. Pricing strategies, the strength of bidding intentions, and online auction performance: a cross-cultural study.

    PubMed

    Peng, Yu-Shu; Jan, Lih-Tsyr

    2009-10-01

    Over the past decade, electronic markets based on the Internet, particularly online auctions, have become popular venues for conducting business. Previous studies often focused on the construction of the best bidding model, while few studies have tried to integrate multiple pricing strategies to predict the probability of closing an auction and the price premium. This study constructs a mediated model to examine the relationship among pricing strategies, the strength of bidding intentions, and online auction performance. The sample consists of 1,055 auctions of iPod MP3 players from eBay Web sites in Hong Kong, Singapore, Belgium, and France. Empirical results show that the pricing strategies directly influence both the probability of closing an auction and the level of price premium. The pricing strategies also indirectly influence the price premium through the mediating effect of the strength of bidding intentions.

  20. Consistency assessment with global and bridging development strategies in emerging markets.

    PubMed

    Li, Gang; Chen, Josh; Quan, Hui; Shentu, Yue

    2013-11-01

    Global trial strategy with the participation of all major regions including countries from emerging markets surely increases new drug development efficiency. Nevertheless, there are circumstances in which some countries in emerging markets cannot join the original global trial. To evaluate the extrapolability of the original trial results to a new country, a bridging trial in the country has to be conducted. In this paper, we first evaluate the efficiency loss of the bridging trial strategy compared to that of the global trial strategy as a function of between-study variability from consistency assessment perspective. The provided evidence should encourage countries in emerging markets to make a greater effort to participate in the original global trial. We then discuss sample size requirement for desired assurance probability for consistency assessment based on various approaches for both global and bridging trial strategies. Examples are presented for numerical demonstration and comparisons. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. What Strategies Do Physicians and Patients Discuss to Reduce Out-of-Pocket Costs?

    PubMed Central

    Hunter, Wynn G.; Zhang, Cecilia Z.; Hesson, Ashley; Davis, J. Kelly; Kirby, Christine; Williamson, Lillie D.; Barnett, Jamison A.; Ubel, Peter A.

    2015-01-01

    Background More than 1 in 4 Americans report difficulty paying medical bills. Cost-reducing strategies discussed during outpatient physician visits remain poorly characterized. Objective We sought to determine how often patients and physicians discuss healthcare costs during outpatient visits and what strategies, if any, they discussed to lower patient out-of-pocket costs. Design Retrospective analysis of dialogue from 1,755 outpatient visits in community-based practices nationwide from 2010–2014. The study population included 677 patients with breast cancer, 422 with depression, and 656 with rheumatoid arthritis visiting 56 oncologists, 36 psychiatrists, and 26 rheumatologists, respectively. Results Thirty percent of visits contained cost conversations (95% confidence interval [CI], 28 to 32). Forty-four percent of cost conversations involved discussion of cost-saving strategies (95% CI, 40 to 48; median duration, 68 seconds). We identified 4 strategies to lower costs without changing the care plan – in order of overall frequency, (1) Changing logistics of care; (2) Facilitating copay assistance; (3) Providing free samples; (4) Changing/adding insurance plans – and 4 strategies to reduce costs by changing the care plan – (1) Switching to lower-cost alternative therapy/diagnostic; (2) Switching from brand name to generic; (3) Changing dosage/frequency; (4) Stopping/withholding interventions. Strategies were relatively consistent across health conditions, except for switching to lower-cost alternative (more common in breast oncology), and providing free samples (more common in depression). Limitation Focus on three conditions with potentially high out-of-pocket costs. Conclusions Despite price opacity, physicians and patients discuss a variety of out-of-pocket cost reduction strategies during clinic visits. Almost half of cost discussions mention one or more cost-saving strategies, with more frequent mention of those not requiring care-plan changes. PMID:26785714

  2. User-Driven Sampling Strategies in Image Exploitation

    DOE PAGES

    Harvey, Neal R.; Porter, Reid B.

    2013-12-23

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. We discovered that user-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. Furthermore, in preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.
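    A minimal sketch contrasting the two selection modes discussed above: computer-driven selection (active-learning style, pick the most uncertain sample) versus user-driven selection, here crudely approximated by a user flagging samples the current tool misclassifies. The dataset, model, and selection rules are hypothetical stand-ins, not the tool or experiments from the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Hypothetical pool of image-derived feature vectors; labels exist but are
# only "revealed" when a sample is selected for labeling during the loop.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
labeled = list(np.where(y == 0)[0][:5]) + list(np.where(y == 1)[0][:5])

def computer_selects(model, pool_idx):
    """Active-learning style selection: the most uncertain unlabeled sample."""
    probs = model.predict_proba(X[pool_idx])[:, 1]
    return pool_idx[int(np.argmin(np.abs(probs - 0.5)))]

def user_selects(model, pool_idx, rng):
    """Crude stand-in for a user-driven choice: a sample the current tool
    gets wrong, which a human reviewing the output might flag for labeling."""
    preds = model.predict(X[pool_idx])
    wrong = [i for i, p in zip(pool_idx, preds) if p != y[i]]
    return rng.choice(wrong) if wrong else rng.choice(pool_idx)

rng = np.random.default_rng(0)
for step in range(40):
    model = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
    pool = [i for i in range(len(X)) if i not in labeled]
    pick = user_selects(model, pool, rng)   # swap in computer_selects to compare
    labeled.append(int(pick))

print("accuracy on the full pool:", round(model.score(X, y), 3))
```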

  3. User-driven sampling strategies in image exploitation

    NASA Astrophysics Data System (ADS)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.

  4. Effects of Sampling Strategy, Detection Probability, and Independence of Counts on the Use of Point Counts

    Treesearch

    Grey W. Pendleton

    1995-01-01

    Many factors affect the use of point counts for monitoring bird populations, including sampling strategies, variation in detection rates, and independence of sample points. The most commonly used sampling plans are stratified sampling, cluster sampling, and systematic sampling. Each of these might be most useful for different objectives or field situations. Variation...

  5. Prediction-based sampled-data H∞ controller design for attitude stabilisation of a rigid spacecraft with disturbances

    NASA Astrophysics Data System (ADS)

    Zhu, Baolong; Zhang, Zhiping; Zhou, Ding; Ma, Jie; Li, Shunli

    2017-08-01

    This paper investigates the H∞ control problem of the attitude stabilisation of a rigid spacecraft with external disturbances using prediction-based sampled-data control strategy. Aiming to achieve a 'virtual' closed-loop system, a type of parameterised sampled-data controller is designed by introducing a prediction mechanism. The resultant closed-loop system is equivalent to a hybrid system featured by a continuous-time and an impulsive differential system. By using a time-varying Lyapunov functional, a generalised bounded real lemma (GBRL) is first established for a kind of impulsive differential system. Based on this GBRL and Lyapunov functional approach, a sufficient condition is derived to guarantee the closed-loop system to be asymptotically stable and to achieve a prescribed H∞ performance. In addition, the controller parameter tuning is cast into a convex optimisation problem. Simulation and comparative results are provided to illustrate the effectiveness of the developed control scheme.

  6. Preservice Teachers' Sense of Efficacy: Video vs. Face-to-Face Observations

    ERIC Educational Resources Information Center

    Chisenhall, Debra Ellen

    2016-01-01

    This study examined preservice elementary education students' sense of efficacy regarding student engagement, instructional strategies, and classroom management based on the type of observations they completed. A total sample size of 64 elementary education students enrolled in four sections of an introductory elementary education course and…

  7. Investigation of a Low Cost Sensor-Based Leak Detection System for Fence Line Applications

    EPA Science Inventory

    With recent technological advances, low-cost time-resolved sensors may become effective tools to support time-integrated passive sampling strategies by helping to decipher origin of emissions in real-time. As part of the Petroleum Refinery Risk and Technology Review, New Source P...

  8. Learning to Write and Loving It! Preschool-Kindergarten

    ERIC Educational Resources Information Center

    Trehearne, Miriam P.

    2011-01-01

    "Learning to Write and Loving It!" equips teachers of young children with practical strategies, assessment tools, and motivating writing activities that are based on current research and proven practice and are easily applicable to all kinds of learning environments. Included are many authentic writing samples and photos to illustrate effective,…

  9. The Complete Guide to Training Delivery: A Competency-Based Approach.

    ERIC Educational Resources Information Center

    King, Stephen B.; King, Marsha; Rothwell, William J.

    This guide focuses on 14 instructor competencies identified by the International Board of Standards for Training, Performance, and Instruction. It provides examples, job aids, worksheets, case studies, and sample dialogs and contains actual experiences and critical incidents faced by trainers who participated in the survey study. Strategies to…

  10. Developing International Managers: The Contribution of Cultural Experience to Learning

    ERIC Educational Resources Information Center

    Townsend, Peter; Regan, Padraic; Li, Liang Liang

    2015-01-01

    Purpose: The purpose of this paper is to evaluate cultural experience as a learning strategy for developing international managers. Design/methodology/approach: Using an integrated framework, two quantitative studies, based on empirical methodology, are conducted. Study 1, with an undergraduate sample situated in the Asia Pacific, aimed to examine…

  11. Generalized essential energy space random walks to more effectively accelerate solute sampling in aqueous environment

    NASA Astrophysics Data System (ADS)

    Lv, Chao; Zheng, Lianqing; Yang, Wei

    2012-01-01

    Molecular dynamics sampling can be enhanced by promoting potential energy fluctuations, for instance, based on a Hamiltonian modified with the addition of a potential-energy-dependent biasing term. To overcome the diffusion sampling issue, in which enlarging event-irrelevant energy fluctuations can undermine sampling efficiency, the essential energy space random walk (EESRW) approach was proposed earlier. To more effectively accelerate the sampling of solute conformations in an aqueous environment, in the current work, we generalized the EESRW method to a two-dimension-EESRW (2D-EESRW) strategy. Specifically, the essential internal energy component of a focused region and the essential interaction energy component between the focused region and the environmental region are employed to define the two-dimensional essential energy space. This proposal is motivated by the general observation that in different conformational events, the two essential energy components have distinctive interplays. Model studies on the alanine dipeptide and the aspartate-arginine peptide demonstrate sampling improvement over the original one-dimension-EESRW strategy; with the same biasing level, the present generalization allows more effective acceleration of the sampling of conformational transitions in aqueous solution. The 2D-EESRW generalization can readily be extended to higher-dimensional schemes and employed in more advanced enhanced-sampling methods, such as the recent orthogonal space random walk method.

  12. Sampling Designs in Qualitative Research: Making the Sampling Process More Public

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Leech, Nancy L.

    2007-01-01

    The purpose of this paper is to provide a typology of sampling designs for qualitative researchers. We introduce the following sampling strategies: (a) parallel sampling designs, which represent a body of sampling strategies that facilitate credible comparisons of two or more different subgroups that are extracted from the same levels of study;…

  13. The Relationship between Anxiety and Coping Strategies in Family Caregivers of Patients with Trauma.

    PubMed

    Rahnama, Mozhgan; Shahdadi, Hosien; Bagheri, Somyeh; Moghadam, Mahdieh Poodineh; Absalan, Ahmad

    2017-04-01

    Traumatic events are of high incidence and affect not only the patient but also their family members, causing psychological problems such as stress and anxiety for caregivers of these patients. Therefore, the application of appropriate coping strategies by them seems necessary in order to promote mental health. To study the relationship of anxiety with coping strategies in family caregivers of trauma patients. The present research was a descriptive-correlational study carried out on 127 family caregivers of patients with trauma in the intensive care unit, surgery ward and emergency unit of Amir al-Mu'minin Hospital of Zabol, Sistan and Baluchestan Province. The respondents were selected based on the convenience sampling method. A demographics questionnaire, the DASS-21, and a coping strategies questionnaire were used for data collection. The obtained data were statistically analysed using descriptive statistics, Analysis of Variance (ANOVA), t-tests, and Pearson correlation coefficients in the Statistical Package for the Social Sciences (SPSS) version 21.0. Based on the results, 89.9% of family caregivers suffered from mild to severe anxiety. The most common type of coping strategy used by the respondents was emotion-focused. The results showed no relationship between anxiety and emotion-focused coping, but an inverse relationship between problem-focused coping and anxiety. The majority of family caregivers had anxiety. Given the inverse relationship between the level of anxiety and the use of problem-focused coping, it is recommended that, in addition to identifying and reducing the causes of anxiety in caregivers, appropriate coping strategies be taught to them.

  14. The Relationship between Anxiety and Coping Strategies in Family Caregivers of Patients with Trauma

    PubMed Central

    Rahnama, Mozhgan; Bagheri, Somyeh; Moghadam, Mahdieh Poodineh; Absalan, Ahmad

    2017-01-01

    Introduction Traumatic events are of high incidence and affect not only the patient but also their family members, causing psychological problems such as stress and anxiety for caregivers of these patients. Therefore, the application of appropriate coping strategies by them seems necessary in order to promote mental health. Aim To study the relationship of anxiety with coping strategies in family caregivers of trauma patients. Materials and Methods The present research was a descriptive-correlational study carried out on 127 family caregivers of patients with trauma in the intensive care unit, surgery ward and emergency unit of Amir al-Mu’minin Hospital of Zabol, Sistan and Baluchestan Province. The respondents were selected based on the convenience sampling method. A demographics questionnaire, the DASS-21, and a coping strategies questionnaire were used for data collection. The obtained data were statistically analysed using descriptive statistics, Analysis of Variance (ANOVA), t-tests, and Pearson correlation coefficients in the Statistical Package for the Social Sciences (SPSS) version 21.0. Results Based on the results, 89.9% of family caregivers suffered from mild to severe anxiety. The most common type of coping strategy used by the respondents was emotion-focused. The results showed no relationship between anxiety and emotion-focused coping, but an inverse relationship between problem-focused coping and anxiety. Conclusion The majority of family caregivers had anxiety. Given the inverse relationship between the level of anxiety and the use of problem-focused coping, it is recommended that, in addition to identifying and reducing the causes of anxiety in caregivers, appropriate coping strategies be taught to them. PMID:28571166

  15. Modelling of in-stream nitrogen and phosphorus concentrations using different sampling strategies for calibration data

    NASA Astrophysics Data System (ADS)

    Jomaa, Seifeddine; Jiang, Sanyuan; Yang, Xiaoqiang; Rode, Michael

    2016-04-01

    It is known that a good evaluation and prediction of surface water pollution is mainly limited by the monitoring strategy and the capability of the hydrological water quality model to reproduce the internal processes. To this end, a compromise sampling frequency, which can reflect the dynamical behaviour of leached nutrient fluxes responding to changes in land use, agricultural practices and point sources, and an appropriate process-based water quality model are required. The objective of this study was to test the identification of hydrological water quality model parameters (nitrogen and phosphorus) under two different monitoring strategies: (1) a regular grab-sampling approach and (2) regular grab-sampling with additional monitoring during hydrological events using automatic samplers. First, the semi-distributed hydrological water quality HYPE (Hydrological Predictions for the Environment) model was successfully calibrated for the period 1994-1998 for discharge (NSE = 0.86), nitrate-N (lowest NSE for nitrate-N load = 0.69), particulate phosphorus and soluble phosphorus in the Selke catchment (463 km², central Germany) using the regular grab-sampling approach (biweekly to monthly nitrogen and phosphorus concentrations). Second, the model was successfully validated for the period 1999-2010 for discharge, nitrate-N, particulate phosphorus and soluble phosphorus (lowest NSE for soluble phosphorus load = 0.54). Results showed that, when additional samples collected during hydrological events were combined with the regular grab-sampling data (period 2011-2013), the hydrological model could still reproduce only the nitrate-N and soluble phosphorus concentrations reasonably well; it could not represent the measured particulate phosphorus. This reflects the importance of suspended sediment in increasing particulate phosphorus concentrations during hydrological events. The HYPE model could reproduce total phosphorus during the period 2011-2013 only when the sediment transport-related model parameters were re-identified, taking the automatic sampling during high-flow conditions into account.
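    The calibration and validation quality above is reported as the Nash-Sutcliffe efficiency (NSE). A minimal sketch of that metric, using made-up daily load values purely for illustration:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 means the model is
    no better than simply using the mean of the observations."""
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - np.mean(observed)) ** 2)

# Toy example with hypothetical daily nitrate-N loads (kg/day).
obs = np.array([12.0, 15.0, 30.0, 22.0, 18.0, 14.0])
sim = np.array([11.0, 16.5, 26.0, 24.0, 17.0, 13.5])
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
```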

  16. Evaluation of simple rapid HIV assays and development of national rapid HIV test algorithms in Dar es Salaam, Tanzania

    PubMed Central

    2009-01-01

    Background Suitable algorithms based on a combination of two or more simple rapid HIV assays have been shown to have a diagnostic accuracy comparable to double enzyme-linked immunosorbent assay (ELISA) or double ELISA with Western Blot strategies. The aims of this study were to evaluate the performance of five simple rapid HIV assays using whole blood samples from HIV-infected patients, pregnant women, voluntary counseling and testing attendees and blood donors, and to formulate an alternative confirmatory strategy based on rapid HIV testing algorithms suitable for use in Tanzania. Methods Five rapid HIV assays: Determine™ HIV-1/2 (Inverness Medical), SD Bioline HIV 1/2 3.0 (Standard Diagnostics Inc.), First Response HIV Card 1–2.0 (PMC Medical India Pvt Ltd), HIV1/2 Stat-Pak Dipstick (Chembio Diagnostic System, Inc) and Uni-Gold™ HIV-1/2 (Trinity Biotech) were evaluated between June and September 2006 using 1433 whole blood samples from hospital patients, pregnant women, voluntary counseling and testing attendees and blood donors. All samples that were reactive on all or any of the five rapid assays and 10% of non-reactive samples were tested on a confirmatory Inno-Lia HIV I/II immunoblot assay (Immunogenetics). Results Three hundred and ninety samples were confirmed HIV-1 antibody positive, while 1043 were HIV negative. The sensitivity at initial testing of Determine, SD Bioline and Uni-Gold™ was 100% (95% CI; 99.1–100) while First Response and Stat-Pak had sensitivity of 99.5% (95% CI; 98.2–99.9) and 97.7% (95% CI; 95.7–98.9), respectively, which increased to 100% (95% CI; 99.1–100) on repeat testing. The initial specificity of the Uni-Gold™ assay was 100% (95% CI; 99.6–100) while specificities were 99.6% (95% CI; 99–99.9), 99.4% (95% CI; 98.8–99.7), 99.6% (95% CI; 99–99.9) and 99.8% (95% CI; 99.3–99.9) for Determine, SD Bioline, First Response and Stat-Pak assays, respectively. No sample was concordantly false positive on the Uni-Gold™, Determine and SD Bioline assays. Conclusion An alternative confirmatory HIV testing strategy based on initial testing on either SD Bioline or Determine assays followed by testing of reactive samples on the Determine or SD Bioline gave 100% sensitivity (95% CI; 99.1–100) and 100% specificity (95% CI; 96–99.1) with Uni-Gold™ as tiebreaker for discordant results. PMID:19226452
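    A minimal sketch of how sensitivity and specificity with exact (Clopper-Pearson) confidence intervals, as reported above, can be computed. The 2x2 counts used for the Determine assay are reconstructed from the published percentages (390 true positives and roughly 4 false reactives among 1043 confirmed negatives), so they are approximate.

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) confidence interval for a proportion."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

def sens_spec(tp, fn, tn, fp):
    sens, spec = tp / (tp + fn), tn / (tn + fp)
    return (sens, clopper_pearson(tp, tp + fn)), (spec, clopper_pearson(tn, tn + fp))

# Determine assay at initial testing, counts reconstructed from the abstract.
(sens, sens_ci), (spec, spec_ci) = sens_spec(tp=390, fn=0, tn=1039, fp=4)
print(f"sensitivity {sens:.1%} (95% CI {sens_ci[0]:.1%}-{sens_ci[1]:.1%})")
print(f"specificity {spec:.1%} (95% CI {spec_ci[0]:.1%}-{spec_ci[1]:.1%})")
```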

  17. Evaluation of simple rapid HIV assays and development of national rapid HIV test algorithms in Dar es Salaam, Tanzania.

    PubMed

    Lyamuya, Eligius F; Aboud, Said; Urassa, Willy K; Sufi, Jaffer; Mbwana, Judica; Ndugulile, Faustin; Massambu, Charles

    2009-02-18

    Suitable algorithms based on a combination of two or more simple rapid HIV assays have been shown to have a diagnostic accuracy comparable to double enzyme-linked immunosorbent assay (ELISA) or double ELISA with Western Blot strategies. The aims of this study were to evaluate the performance of five simple rapid HIV assays using whole blood samples from HIV-infected patients, pregnant women, voluntary counseling and testing attendees and blood donors, and to formulate an alternative confirmatory strategy based on rapid HIV testing algorithms suitable for use in Tanzania. Five rapid HIV assays: Determine HIV-1/2 (Inverness Medical), SD Bioline HIV 1/2 3.0 (Standard Diagnostics Inc.), First Response HIV Card 1-2.0 (PMC Medical India Pvt Ltd), HIV1/2 Stat-Pak Dipstick (Chembio Diagnostic System, Inc) and Uni-Gold HIV-1/2 (Trinity Biotech) were evaluated between June and September 2006 using 1433 whole blood samples from hospital patients, pregnant women, voluntary counseling and testing attendees and blood donors. All samples that were reactive on all or any of the five rapid assays and 10% of non-reactive samples were tested on a confirmatory Inno-Lia HIV I/II immunoblot assay (Immunogenetics). Three hundred and ninety samples were confirmed HIV-1 antibody positive, while 1043 were HIV negative. The sensitivity at initial testing of Determine, SD Bioline and Uni-Gold was 100% (95% CI; 99.1-100) while First Response and Stat-Pak had sensitivity of 99.5% (95% CI; 98.2-99.9) and 97.7% (95% CI; 95.7-98.9), respectively, which increased to 100% (95% CI; 99.1-100) on repeat testing. The initial specificity of the Uni-Gold assay was 100% (95% CI; 99.6-100) while specificities were 99.6% (95% CI; 99-99.9), 99.4% (95% CI; 98.8-99.7), 99.6% (95% CI; 99-99.9) and 99.8% (95% CI; 99.3-99.9) for Determine, SD Bioline, First Response and Stat-Pak assays, respectively. No sample was concordantly false positive on the Uni-Gold, Determine and SD Bioline assays. An alternative confirmatory HIV testing strategy based on initial testing on either SD Bioline or Determine assays followed by testing of reactive samples on the Determine or SD Bioline gave 100% sensitivity (95% CI; 99.1-100) and 100% specificity (95% CI; 96-99.1) with Uni-Gold as tiebreaker for discordant results.

  18. Quantifying and correcting motion artifacts in MRI

    NASA Astrophysics Data System (ADS)

    Bones, Philip J.; Maclaren, Julian R.; Millane, Rick P.; Watts, Richard

    2006-08-01

    Patient motion during magnetic resonance imaging (MRI) can produce significant artifacts in a reconstructed image. Since measurements are made in the spatial frequency domain ('k-space'), rigid-body translational motion results in phase errors in the data samples while rotation causes location errors. A method is presented to detect and correct these errors via a modified sampling strategy, thereby achieving more accurate image reconstruction. The strategy involves sampling vertical and horizontal strips alternately in k-space and employs phase correlation within the overlapping segments to estimate translational motion. An extension, also based on correlation, is employed to estimate rotational motion. Results from simulations with computer-generated phantoms suggest that the algorithm is robust up to realistic noise levels. The work is being extended to physical phantoms. Provided that a reference image is available and the object is of limited extent, it is shown that a measure related to the amount of energy outside the support can be used to objectively compare the severity of motion-induced artifacts.
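    The abstract describes estimating rigid translation from overlapping data segments via phase correlation. Below is a generic, self-contained sketch of phase correlation for recovering an integer-pixel translation between two 2D arrays; it illustrates only the underlying correlation step, not the authors' modified k-space sampling strategy.

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer-pixel translation between two equally sized
    2D arrays via phase correlation (normalized cross-power spectrum)."""
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = fa * np.conj(fb)
    cross /= np.abs(cross) + 1e-12            # keep only the phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks beyond the half-way point to negative shifts.
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

# Toy check: shift a random "image" by (3, -5) pixels and recover the shift.
rng = np.random.default_rng(1)
img = rng.standard_normal((64, 64))
shifted = np.roll(img, shift=(3, -5), axis=(0, 1))
print(phase_correlation_shift(shifted, img))   # expected ~ (3, -5)
```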

  19. The integrated quality assessment of Chinese commercial dry red wine based on a method of online HPLC-DAD-CL combined with HPLC-ESI-MS.

    PubMed

    Yu, Hai-Xiang; Sun, Li-Qiong; Qi, Jin

    2014-07-01

    To apply an integrated quality assessment strategy to investigate the quality of multiple Chinese commercial dry red wine samples. A comprehensive method was developed by combining a high performance liquid chromatography-diode array detector-chemiluminescence (HPLC-DAD-CL) online hyphenated system with an HPLC-ESI-MS technique. Chromatographic and H2O2-scavenging active fingerprints of thirteen batches of different, commercially available Chinese dry red wine samples were obtained and analyzed. Twenty-five compounds, including eighteen antioxidants, were identified and evaluated. The dominant and characteristic antioxidants in the samples were identified. The relationships between antioxidant potency and the cultivated grape variety, producing area, cellaring period, and trademark are also discussed. The results demonstrate the feasibility of using an integrated quality assessment strategy efficiently and objectively for quality assessment (especially antioxidant activity) and identification of dry red wine. Copyright © 2014 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.

  20. 77 FR 53941 - Emerging Global Advisors, LLC, et al.;

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-04

    ... sampling strategy to track its Underlying Index. A Fund using a replication strategy will invest in the.... A Fund using a representative sampling strategy will hold some, but not necessarily all, of the... strategy, a Fund is not expected to track the performance of its Underlying Index with the same degree of...

  1. New color-based tracking algorithm for joints of the upper extremities

    NASA Astrophysics Data System (ADS)

    Wu, Xiangping; Chow, Daniel H. K.; Zheng, Xiaoxiang

    2007-11-01

    To track the joints of the upper limb of stroke sufferers for rehabilitation assessment, a new tracking algorithm which utilizes a developed color-based particle filter and a novel strategy for handling occlusions is proposed in this paper. Objects are represented by their color histogram models, and a particle filter is introduced to track the objects within a probability framework. A Kalman filter, as a local optimizer, is integrated into the sampling stage of the particle filter; it steers samples to a region of high likelihood, so that fewer samples are required. A color clustering method and anatomic constraints are used to deal with the occlusion problem. Compared with the basic particle filtering method, the experimental results show that the new algorithm reduces the number of samples and hence the computational cost, and handles complete occlusion over a few frames better.
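    A compact sketch of a color-histogram particle filter of the general kind described above: predict particles with a random-walk motion model, weight them by the Bhattacharyya similarity between each candidate patch histogram and a reference histogram, then resample. The Kalman-steered proposal and the occlusion handling from the paper are omitted, and all names and parameters are illustrative assumptions. A tracker would build `ref_hist` from a patch around the joint in the first frame and call `track_step` once per subsequent frame.

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Normalized joint RGB histogram of an image patch (H x W x 3, uint8)."""
    hist, _ = np.histogramdd(patch.reshape(-1, 3),
                             bins=(bins, bins, bins), range=[(0, 256)] * 3)
    return hist.ravel() / (hist.sum() + 1e-12)

def bhattacharyya(p, q):
    """Similarity between two normalized histograms (1 = identical)."""
    return np.sum(np.sqrt(p * q))

def track_step(frame, particles, weights, ref_hist, half=10, sigma_pos=5.0,
               rng=np.random.default_rng()):
    """One predict-weight-resample cycle of a color-based particle filter."""
    # Predict: simple random-walk motion model (the paper steers this step
    # with a Kalman filter; that refinement is omitted here).
    particles = particles + rng.normal(0.0, sigma_pos, particles.shape)
    h, w = frame.shape[:2]
    particles[:, 0] = np.clip(particles[:, 0], half, h - half - 1)
    particles[:, 1] = np.clip(particles[:, 1], half, w - half - 1)

    # Weight: color-histogram likelihood against the reference model.
    for i, (y, x) in enumerate(particles.astype(int)):
        patch = frame[y - half:y + half, x - half:x + half]
        rho = bhattacharyya(color_histogram(patch), ref_hist)
        weights[i] = np.exp(20.0 * (rho - 1.0))
    weights = weights / weights.sum()

    # Estimate the target position, then resample systematically.
    estimate = weights @ particles
    cum = np.cumsum(weights)
    cum[-1] = 1.0                                      # guard against rounding
    idx = np.searchsorted(cum, (np.arange(len(weights)) + rng.random()) / len(weights))
    return particles[idx], np.full(len(weights), 1.0 / len(weights)), estimate

# Tiny synthetic usage: a red square in a noisy frame stands in for a joint.
rng = np.random.default_rng(0)
frame = rng.integers(0, 60, (120, 160, 3), dtype=np.uint8)
frame[50:70, 60:80] = (200, 40, 40)
ref_hist = color_histogram(frame[50:70, 60:80])
particles = np.tile([60.0, 70.0], (200, 1)) + rng.normal(0, 3, (200, 2))
weights = np.full(200, 1.0 / 200)
particles, weights, estimate = track_step(frame, particles, weights, ref_hist, rng=rng)
print("estimated joint position (y, x):", np.round(estimate, 1))
```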

  2. SPR based immunosensor for detection of Legionella pneumophila in water samples

    NASA Astrophysics Data System (ADS)

    Enrico, De Lorenzis; Manera, Maria G.; Montagna, Giovanni; Cimaglia, Fabio; Chiesa, Maurizio; Poltronieri, Palmiro; Santino, Angelo; Rella, Roberto

    2013-05-01

    Detection of legionellae by water sampling is an important factor in epidemiological investigations of Legionnaires' disease and its prevention. To avoid the labor-intensive problems of conventional methods, an alternative, highly sensitive and simple method is proposed for detecting L. pneumophila in aqueous samples. A compact Surface Plasmon Resonance (SPR) instrumentation prototype, equipped with suitable microfluidic tools, is built. The developed immunosensor is capable of dynamically following the binding between antigens and the corresponding antibody molecules immobilized on the SPR sensor surface. The immobilization strategy used in this work includes an efficient step aimed at orienting the antibodies on the sensor surface. The feasibility of integrating SPR-based biosensing setups with microfluidic technologies into a low-cost, portable biosensor is demonstrated.

  3. Effectiveness of Gross Model-Based Emotion Regulation Strategies Training on Anger Reduction in Drug-Dependent Individuals and its Sustainability in Follow-up.

    PubMed

    Massah, Omid; Sohrabi, Faramarz; A'azami, Yousef; Doostian, Younes; Farhoudian, Ali; Daneshmand, Reza

    2016-03-01

    Emotion plays an important role in adapting to life changes and stressful events. Difficulty regulating emotions is one of the problems drug abusers often face, and teaching these individuals to express and manage their emotions can be effective in improving their difficult circumstances. The present study aimed to determine the effectiveness of Gross model-based emotion regulation strategies training on anger reduction in drug-dependent individuals. The study had a quasi-experimental design wherein pretest-posttest evaluations were applied using a control group. The population under study included addicts attending Marivan's methadone maintenance therapy centers in 2012 - 2013. Convenience sampling was used to select 30 substance-dependent individuals undergoing maintenance treatment, who were then randomly assigned to the experimental and control groups. The experimental group received the training in eight two-hour sessions. Data were analyzed using analysis of covariance and paired t-tests. There was a significant reduction in anger symptoms of drug-dependent individuals after Gross model-based emotion regulation training (ERT) (P < 0.001). Moreover, the effectiveness of the training on anger persisted in the follow-up period. Symptoms of anger in the drug-dependent individuals of this study were reduced by Gross model-based emotion regulation strategies training. Based on the results of this study, we may conclude that Gross model-based emotion regulation strategies training can be applied alongside other therapies to treat drug abusers undergoing rehabilitation.

  4. An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

    NASA Astrophysics Data System (ADS)

    Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli

    2018-01-01

    Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the improved VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of a high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the already required information on the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient for an aircraft are provided to demonstrate the approximation capability of the proposed approach, as well as three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
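    As a much-simplified, single-fidelity stand-in for the sequential-sampling idea above (not the paper's improved hierarchical kriging), the sketch below fits a Gaussian-process surrogate and repeatedly adds the candidate point where the surrogate's predictive uncertainty is largest. The test function and all settings are hypothetical.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def high_fidelity(x):
    """Cheap stand-in for an expensive high-fidelity simulation."""
    return np.sin(8 * x) + 0.3 * x

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 4).reshape(-1, 1)            # small initial design
y = high_fidelity(X).ravel()
candidates = np.linspace(0, 1, 201).reshape(-1, 1)

for _ in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                  normalize_y=True).fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)]              # sample where the surrogate is least certain
    X = np.vstack([X, x_new])
    y = np.append(y, high_fidelity(x_new))

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), normalize_y=True).fit(X, y)
rmse = np.sqrt(np.mean((gp.predict(candidates) - high_fidelity(candidates).ravel()) ** 2))
print(f"surrogate RMSE after adaptive sampling: {rmse:.3f}")
```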

  5. Exploring Business Strategy in Health Information Exchange Organizations.

    PubMed

    Langabeer, James R; Champagne, Tiffany

    2016-01-01

    Unlike consumer goods industries, healthcare has been slow to implement technologies that support exchange of data in patients' health records. This results in avoidable medication errors, avoidable hospital readmissions, unnecessary duplicate testing, and other inefficient or wasteful practices. Community-based regional health information exchange (HIE) organizations have evolved in response to federal aims to encourage interoperability, yet little is known about their strategic approach. We use the lens of institutional and strategic management theories to empirically explore the differences in business strategies deployed in HIEs that are, to date, financially sustainable versus those that are not. We developed a 20-question survey targeted to CEOs to assess HIE business strategies. Our sample consisted of 60 community-based exchanges distributed throughout the United States, and we achieved a 58% response rate. Questions centered on competitive strategy and financial sustainability. We relied on logistic regression methods to explore relationships between variables. Our regression identified characteristics common to sustainable organizations. We defined sustainability as revenues exceeding operational costs. Seventeen of the 35 organizations (49%) defined themselves as currently sustainable. Focus and cost leadership strategies were significantly associated with sustainability. Growth strategies, which were much more common than other strategies, were not associated with sustainability. We saw little evidence of a differentiation strategy (i.e., the basis of competition whereby the attributes of a product or service are unmatched by rivals). Most CEOs had a relatively optimistic outlook, with 60% stating they were confident of surviving over the next 5 years; however, nearly 9% of the organizations were in some phase of divestiture or exit from the market. HIEs are evolving differently based on local leadership decisions, yet their strategic approach is isomorphic (or similar). Further insight into successful business strategies could help ensure the long-term survival of HIEs.

  6. Development and testing of an implementation strategy for a complex housing intervention: protocol for a mixed methods study.

    PubMed

    Watson, Dennis P; Young, Jeani; Ahonen, Emily; Xu, Huiping; Henderson, Macey; Shuman, Valery; Tolliver, Randi

    2014-10-17

    There is currently a lack of scientifically designed and tested implementation strategies. Such strategies are particularly important for highly complex interventions that require coordination between multiple parts to be successful. This paper presents a protocol for the development and testing of an implementation strategy for a complex intervention known as the Housing First model (HFM). Housing First is an evidence-based practice for chronically homeless individuals demonstrated to significantly improve a number of outcomes. Drawing on practices demonstrated to be useful in implementation and e-learning theory, our team is currently adapting a face-to-face implementation strategy so that it can be delivered over a distance. Research activities will be divided between Chicago and Central Indiana, two areas with significantly different barriers to HFM implementation. Ten housing providers (five from Chicago and five from Indiana) will be recruited to conduct an alpha test of each of four e-learning modules as they are developed. Providers will be requested to keep a detailed log of their experience completing the modules and participate in one of two focus groups. After refining the modules based on alpha test results, we will test the strategy among a sample of four housing organizations (two from Chicago and two from Indiana). We will collect and analyze both qualitative and quantitative data from administration and staff. Measures of interest include causal factors affecting implementation, training outcomes, and implementation outcomes. This project is an important first step in the development of an evidence-based implementation strategy to increase scalability and impact of the HFM. The project also has strong potential to increase limited scientific knowledge regarding implementation strategies in general.

  7. Recommended Mass Spectrometry-Based Strategies to Identify Ricin-Containing Samples

    PubMed Central

    Kalb, Suzanne R.; Schieltz, David M.; Becher, François; Astot, Crister; Fredriksson, Sten-Åke; Barr, John R.

    2015-01-01

    Ricin is a protein toxin produced by the castor bean plant (Ricinus communis) together with a related protein known as R. communis agglutinin (RCA120). Mass spectrometric (MS) assays have the capacity to unambiguously identify ricin and to detect ricin’s activity in samples with complex matrices. These qualitative and quantitative assays enable detection and differentiation of ricin from the less toxic RCA120 through determination of the amino acid sequence of the protein in question, and active ricin can be monitored by MS as the release of adenine from the depurination of a nucleic acid substrate. In this work, we describe the application of MS-based methods to detect, differentiate and quantify ricin and RCA120 in nine blinded samples supplied as part of the EQuATox proficiency test. Overall, MS-based assays successfully identified all samples containing ricin or RCA120 with the exception of the sample spiked with the lowest concentration (0.414 ng/mL). In fact, mass spectrometry was the most successful method for differentiation of ricin and RCA120 based on amino acid determination. Mass spectrometric methods were also successful at ranking the functional activities of the samples, successfully yielding semi-quantitative results. These results indicate that MS-based assays are excellent techniques to detect, differentiate, and quantify ricin and RCA120 in complex matrices. PMID:26610568

  8. Self-assembly of three-dimensional interconnected graphene-based aerogels and its application in supercapacitors.

    PubMed

    Ji, Chen-Chen; Xu, Mao-Wen; Bao, Shu-Juan; Cai, Chang-Jun; Lu, Zheng-Jiang; Chai, Hui; Yang, Fan; Wei, Hua

    2013-10-01

    Homogeneously distributed self-assembling hybrid graphene-based aerogels with 3D interconnected pores, employing three types of carbohydrates (glucose, β-cyclodextrin, and chitosan), have been fabricated by a simple hydrothermal route. Using three types of carbohydrates as morphology-oriented agents and reductants can effectively tailor the microstructures, physical properties, and electrochemical performances of the products. The effects of different carbohydrates on graphene oxide reduction to form graphene-based aerogels with different microcosmic morphologies and physical properties were also systematically discussed. The electrochemical behaviors of all graphene-based aerogel samples showed remarkably strong and stable performances, which indicated that all the 3D interpenetrating microstructure graphene-based aerogel samples with well-developed porous nanostructures and interconnected conductive networks could provide fast ionic channels for electrochemical energy storage. These results demonstrate that this strategy would offer an easy and effective way to fabricate graphene-based materials. Copyright © 2013 Elsevier Inc. All rights reserved.

  9. Classifier transfer with data selection strategies for online support vector machine classification with class imbalance

    NASA Astrophysics Data System (ADS)

    Krell, Mario Michael; Wilshusen, Nils; Seeland, Anett; Kim, Su Kyoung

    2017-04-01

    Objective. Classifier transfers usually come with dataset shifts. To overcome dataset shifts in practical applications, in this paper we consider the limitations in computational resources for the adaptation of batch learning algorithms such as the support vector machine (SVM). Approach. We focus on data selection strategies that limit the size of the stored training data through different inclusion, exclusion, and further dataset manipulation criteria, including two new approaches for handling class imbalance. We provide a comparison of the strategies with linear SVMs on several synthetic datasets with different data shifts as well as on different transfer settings with electroencephalographic (EEG) data. Main results. For the synthetic data, adding only misclassified samples performed astoundingly well. Here, balancing criteria were very important when the other criteria were not well chosen. For the transfer setups, the results show that the best strategy depends on the intensity of the drift during the transfer. For larger drifts, adding all new samples and removing the oldest results in the best performance, whereas for smaller drifts it can be sufficient to add only samples near the decision boundary of the SVM, which reduces processing resources. Significance. For brain-computer interfaces based on EEG data, models trained on data from a calibration session, a previous recording session, or even from a recording session with another subject are used. We show that by using the right combination of data selection criteria, it is possible to adapt the SVM classifier to overcome the performance drop from the transfer.
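
    As a rough illustration of the kind of data selection criteria described above (adding misclassified or near-boundary samples, dropping the oldest, capping the stored pool), the following Python sketch retrains a linear SVM in batch on a bounded training pool. It is a simplified stand-in for the authors' strategies; the thresholds, pool size, and toy drifting data are all assumptions.

      # Illustrative sketch (not the authors' code): batch retraining of a linear SVM
      # with simple data selection criteria for classifier transfer under drift.
      from collections import deque
      import numpy as np
      from sklearn.svm import LinearSVC

      def update_training_pool(pool, X_new, y_new, clf, margin=1.0):
          """Add only new samples that are misclassified or near the decision
          boundary; the bounded deque silently drops the oldest samples."""
          scores = clf.decision_function(X_new)
          for x, y, s in zip(X_new, y_new, scores):
              misclassified = np.sign(s) != np.sign(2 * y - 1)   # labels assumed in {0, 1}
              near_boundary = abs(s) < margin
              if misclassified or near_boundary:
                  pool.append((x, y))
          return pool

      # toy usage with synthetic drifting data
      rng = np.random.default_rng(0)
      X0 = rng.normal(size=(200, 2)); y0 = (X0[:, 0] > 0).astype(int)
      pool = deque([(x, y) for x, y in zip(X0, y0)], maxlen=500)
      clf = LinearSVC().fit(X0, y0)

      for step in range(1, 6):                                   # simulated online batches
          Xb = rng.normal(loc=0.2 * step, size=(50, 2))          # gradual covariate shift
          yb = (Xb[:, 0] > 0.2 * step).astype(int)
          pool = update_training_pool(pool, Xb, yb, clf)
          Xp = np.array([p[0] for p in pool]); yp = np.array([p[1] for p in pool])
          clf = LinearSVC().fit(Xp, yp)                          # batch retraining on the pool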

  10. Gaze data reveal distinct choice processes underlying model-based and model-free reinforcement learning

    PubMed Central

    Konovalov, Arkady; Krajbich, Ian

    2016-01-01

    Organisms appear to learn and make decisions using different strategies known as model-free and model-based learning; the former is mere reinforcement of previously rewarded actions and the latter is a forward-looking strategy that involves evaluation of action-state transition probabilities. Prior work has used neural data to argue that both model-based and model-free learners implement a value comparison process at trial onset, but model-based learners assign more weight to forward-looking computations. Here using eye-tracking, we report evidence for a different interpretation of prior results: model-based subjects make their choices prior to trial onset. In contrast, model-free subjects tend to ignore model-based aspects of the task and instead seem to treat the decision problem as a simple comparison process between two differentially valued items, consistent with previous work on sequential-sampling models of decision making. These findings illustrate a problem with assuming that experimental subjects make their decisions at the same prescribed time. PMID:27511383

  11. Using Genotype Abundance to Improve Phylogenetic Inference

    PubMed Central

    Mesin, Luka; Victora, Gabriel D; Minin, Vladimir N; Matsen, Frederick A

    2018-01-01

    Abstract Modern biological techniques enable very dense genetic sampling of unfolding evolutionary histories, and thus frequently sample some genotypes multiple times. This motivates strategies to incorporate genotype abundance information in phylogenetic inference. In this article, we synthesize a stochastic process model with standard sequence-based phylogenetic optimality, and show that tree estimation is substantially improved by doing so. Our method is validated with extensive simulations and an experimental single-cell lineage tracing study of germinal center B cell receptor affinity maturation. PMID:29474671

  12. The impact of privacy protections on recruitment in a multicenter stroke genetics study

    PubMed Central

    Chen, D.T.; Worrall, B.B.; Brown, R.D.; Brott, T.G.; Kissela, B.M.; Olson, T.S.; Rich, S.S.; Meschia, J.F.

    2006-01-01

    The authors reviewed the recruitment of stroke-affected sibling pairs using a letter-based, proband-initiated contact strategy. The authors randomly sampled 99 proband enrollment forms (Phase 1) and randomly sampled 50 sibling reply cards (Phase 2). The sibling response rate was 30.6%, for a pedigree response rate of 58%. Of the siblings who replied, 96% authorized further contact. Median time from proband enrollment to pedigree DNA banking, which required 3+ probands, was 134 days. PMID:15728301

  13. A pre-classification strategy based on UPLC-Triple-TOF/MS for metabolic screening and identification of Radix glehniae in rats.

    PubMed

    Wang, Shuang; Qi, Pengcheng; Zhou, Na; Zhao, Minmin; Ding, Weijing; Li, Song; Liu, Minyan; Wang, Qiao; Jin, Shumin

    2016-10-01

    Traditional Chinese Medicines (TCMs) have gained increasing popularity in modern society. However, the profiles of TCMs in vivo are still unclear owing to their complexity and low levels in vivo. In this study, UPLC-Triple-TOF techniques were employed for data acquisition, and a novel pre-classification strategy was developed to rapidly and systematically screen and identify the absorbed constituents and metabolites of TCMs in vivo using Radix glehniae as the research object. In this strategy, pre-classification for absorbed constituents was first performed according to the similarity of their structures. Representative constituents were then selected from every class and analyzed separately to screen non-target absorbed constituents and metabolites in biosamples. This pre-classification strategy uses target (known) constituents to screen non-target (unknown) constituents from the massive data acquired by mass spectrometry. Finally, the screened candidate compounds were interpreted and identified based on predicted metabolic pathways, well-studied fragmentation rules, the polarity and retention times of the compounds, and related literature. With this method, a total of 111 absorbed constituents and metabolites of Radix glehniae in rats' urine, plasma, and bile samples were screened and identified or tentatively characterized successfully. This strategy provides an approach for the screening and identification of the metabolites of other TCMs.

  14. Characterization and Delivery of Hanford High-Level Radioactive Waste Slurry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thien, Michael G.; Denslow, Kayte M.; Lee, K. P.

    2014-11-15

    Two primary challenges to characterizing Hanford’s high-level radioactive waste slurry prior to transfer to a treatment facility are the ability to representatively sample million-gallon tanks and to estimate the critical velocity of the complex slurry. Washington River Protection Solutions has successfully demonstrated a sampling concept that minimizes sample errors by collecting multiple sample increments from a sample loop where the mixed tank contents are recirculated. Pacific Northwest National Laboratory has developed and demonstrated an ultrasonic-based Pulse-Echo detection device that is capable of detecting a stationary settled bed of solids in a pipe with flowing slurry. These two concepts are essential elements of a feed delivery strategy that drives the Hanford clean-up mission.

  15. When the management of nurse absenteeism becomes a cause of absenteeism: a study based on a comparison of two health care facilities.

    PubMed

    Damart, Sébastien; Kletz, Frédéric

    2016-01-01

    The study aims to explore perceptions of the causes of nurse absenteeism. Nurse absenteeism is rising in many countries. However, there is little evidence as to how strategies adopted in order to cushion the effects of absenteeism on workload influence absenteeism itself. The study used a 'qualitative' method based on cognitive mapping techniques in order to represent perceptions about absenteeism graphically. The study was conducted in two health-care facilities with a sample of 55 interviewees. Absenteeism is due in part to strategies adopted in order to cushion the effects of absenteeism on organisations. Furthermore, the strategies are self-legitimising. The more they are used, the more they are regarded as normal and useful. A plan to reduce absenteeism among nurses must explicitly take into account the strategies used to cushion its impact. To cushion the effects of absenteeism among caregivers, managers must make trade-offs that take into account, for example, the workload or disruption linked to the substitution of personnel. © 2014 John Wiley & Sons Ltd.

  16. A Comprehensive Strategy to Construct In-house Database for Accurate and Batch Identification of Small Molecular Metabolites.

    PubMed

    Zhao, Xinjie; Zeng, Zhongda; Chen, Aiming; Lu, Xin; Zhao, Chunxia; Hu, Chunxiu; Zhou, Lina; Liu, Xinyu; Wang, Xiaolin; Hou, Xiaoli; Ye, Yaorui; Xu, Guowang

    2018-05-29

    Identification of metabolites is an essential step in metabolomics studies for interpreting the regulatory mechanisms of pathological and physiological processes. However, it remains a major challenge in LC-MSn-based studies because of the complexity of mass spectrometry, the chemical diversity of metabolites, and the lack of standards databases. In this work, a comprehensive strategy is developed for accurate, batch metabolite identification in non-targeted metabolomics studies. First, a well-defined procedure was applied to generate reliable, standardized LC-MS2 data, including tR, MS1, and MS2 information, under a standard operating procedure (SOP). An in-house database including about 2000 metabolites was constructed and used to identify metabolites in non-targeted metabolic profiling by retention time calibration using internal standards, precursor ion alignment and ion fusion, auto-MS2 information extraction and selection, and database batch searching and scoring. As an application example, a pooled serum sample was analyzed to demonstrate the strategy; 202 metabolites were identified in positive ion mode. These results show that our strategy is useful for LC-MSn-based non-targeted metabolomics studies.
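
    The matching step described above (retention-time calibration against internal standards plus precursor matching within an m/z tolerance) can be illustrated with the following Python sketch. The library entries, tolerances, and scoring rule are illustrative assumptions, not the authors' in-house database or scoring scheme.

      # Illustrative sketch (not the authors' pipeline): match LC-MS features to an
      # in-house metabolite library by calibrated retention time and precursor m/z.
      import numpy as np

      def calibrate_rt(rt, standards_measured, standards_library):
          """Linear retention-time calibration estimated from internal standards."""
          slope, intercept = np.polyfit(standards_measured, standards_library, 1)
          return slope * rt + intercept

      def match_feature(mz, rt, library, mz_tol_ppm=10.0, rt_tol=0.5):
          """Return library entries within the m/z and RT tolerances, scored by the
          combined relative deviation (smaller is better)."""
          hits = []
          for entry in library:                     # entry: dict with 'name', 'mz', 'rt'
              dppm = abs(mz - entry['mz']) / entry['mz'] * 1e6
              drt = abs(rt - entry['rt'])
              if dppm <= mz_tol_ppm and drt <= rt_tol:
                  hits.append((dppm / mz_tol_ppm + drt / rt_tol, entry['name']))
          return sorted(hits)

      # toy usage: two internal standards define the RT calibration
      library = [{'name': 'tryptophan', 'mz': 205.0972, 'rt': 4.10},
                 {'name': 'phenylalanine', 'mz': 166.0863, 'rt': 3.55}]
      rt_cal = calibrate_rt(4.30, standards_measured=[2.0, 6.0], standards_library=[1.9, 5.8])
      print(match_feature(205.0975, rt_cal, library))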

  17. Compared analysis of different sampling strategies for the monitoring of pesticide contamination in streams

    NASA Astrophysics Data System (ADS)

    Liger, Lucie; Margoum, Christelle; Guillemain, Céline; Carluer, Nadia

    2014-05-01

    The implementation of the WFD (Water Framework Directive) requires European Union member states to achieve good qualitative and quantitative status of all water bodies by 2015. The monitoring of organic micropollutants such as pesticides is an essential step to assess the chemical and biological state of streams, to understand the reasons for degradation and to implement sound mitigation solutions in the watershed. In particular, water sampling, which can be performed according to several strategies, has to be closely adapted to the experimental goals. In this study, we present and compare three different active sampling strategies: grab sampling and time-related and flow-dependent automatic sampling. In addition, the last two can be fractionated (i.e., several samples collected, and each one contained in a single bottle) or averaged (i.e., several samples mixed in the same bottle). Time-related samples allow the assessment of average exposure concentrations of organic micropollutants, whereas flow-dependent samples lead to average flux concentrations. The three sampling strategies were applied and compared during the monitoring of the pesticide contamination of a river located in a French vineyard watershed (the Morcille River, located 60 km north of Lyon, in the Beaujolais region). Data were collected between 2007 and 2011, during different seasons and for a range of hydrological events. The Morcille watershed is characterized by contrasting hydrological events with a very short response time due to its small size (5 km²), steep slopes (20 to 28%) and highly permeable sandy soils. These features make it particularly difficult to monitor water quality, due to fast variations of pesticide concentrations depending on rain events. This comparative study is performed in two steps. First, we compare the timestamps of each sample composing the weekly-averaged samples and those of the grab samples with hydrological data. This allows us to evaluate the efficiency of these two sampling strategies in the integration of flow variations and therefore pesticide concentration variations during the sampling campaign. In a second step, we use the fractionated sample data from flood events to calculate the concentrations of virtual averaged samples of the events. Different time or flow steps were used for the calculation, to assess their impact on the averaged pesticide concentrations or fluxes. These analyses highlight the benefits and drawbacks of each sampling strategy. They show that the sampling strategy should be carefully chosen and designed depending on the final aim of the study and on the watershed characteristics (in particular its hydrological dynamics). This study may help in designing future water quality monitoring. Key Words: sampling strategies, surface water, concentration, flux, pesticides.
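
    The difference between time-related (exposure) and flow-dependent (flux) averaging described above can be illustrated with a small Python sketch over one hypothetical fractionated event; the discharge and concentration values are invented for illustration only.

      # Illustrative sketch: time-weighted vs flow-weighted averages from
      # fractionated sub-samples of one event (all input values are invented).
      # Each sub-sample i covers a time step dt_i [s] with mean discharge q_i [L/s]
      # and measured pesticide concentration c_i [ng/L].
      dt = [900, 900, 900, 900]          # four 15-minute fractions
      q  = [20.0, 120.0, 300.0, 80.0]    # discharge, L/s
      c  = [50.0, 200.0, 400.0, 100.0]   # concentration, ng/L

      time_weighted_conc = sum(ci * dti for ci, dti in zip(c, dt)) / sum(dt)   # exposure average, ng/L
      volume = sum(qi * dti for qi, dti in zip(q, dt))                         # total volume, L
      load_ng = sum(ci * qi * dti for ci, qi, dti in zip(c, q, dt))            # exported mass, ng
      flow_weighted_conc = load_ng / volume                                    # flux average, ng/L

      print(f"time-weighted:  {time_weighted_conc:.1f} ng/L")
      print(f"flow-weighted:  {flow_weighted_conc:.1f} ng/L")
      print(f"event load:     {load_ng / 1e6:.2f} mg")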

  18. Expert recommendations for implementing change (ERIC): protocol for a mixed methods study

    PubMed Central

    2014-01-01

    Background Identifying feasible and effective implementation strategies that are contextually appropriate is a challenge for researchers and implementers, exacerbated by the lack of conceptual clarity surrounding terms and definitions for implementation strategies, as well as a literature that provides imperfect guidance regarding how one might select strategies for a given healthcare quality improvement effort. In this study, we will engage an Expert Panel comprising implementation scientists and mental health clinical managers to: establish consensus on a common nomenclature for implementation strategy terms, definitions and categories; and develop recommendations to enhance the match between implementation strategies selected to facilitate the use of evidence-based programs and the context of certain service settings, in this case the U.S. Department of Veterans Affairs (VA) mental health services. Methods/Design This study will use purposive sampling to recruit an Expert Panel comprising implementation science experts and VA mental health clinical managers. A novel, four-stage sequential mixed methods design will be employed. During Stage 1, the Expert Panel will participate in a modified Delphi process in which a published taxonomy of implementation strategies will be used to establish consensus on terms and definitions for implementation strategies. In Stage 2, the panelists will complete a concept mapping task, which will yield conceptually distinct categories of implementation strategies as well as ratings of the feasibility and effectiveness of each strategy. Utilizing the common nomenclature developed in Stages 1 and 2, panelists will complete an innovative menu-based choice task in Stage 3 that involves matching implementation strategies to hypothetical implementation scenarios with varying contexts. This allows for quantitative characterizations of the relative necessity of each implementation strategy for a given scenario. In Stage 4, a live web-based facilitated expert recommendation process will be employed to establish expert recommendations about which implementation strategies are essential for each phase of implementation in each scenario. Discussion Using a novel method of selecting implementation strategies for use within specific contexts, this study contributes to our understanding of implementation science and practice by sharpening conceptual distinctions among a comprehensive collection of implementation strategies. PMID:24669765

  19. LORAKS Makes Better SENSE: Phase-Constrained Partial Fourier SENSE Reconstruction without Phase Calibration

    PubMed Central

    Kim, Tae Hyung; Setsompop, Kawin; Haldar, Justin P.

    2016-01-01

    Purpose Parallel imaging and partial Fourier acquisition are two classical approaches for accelerated MRI. Methods that combine these approaches often rely on prior knowledge of the image phase, but the need to obtain this prior information can place practical restrictions on the data acquisition strategy. In this work, we propose and evaluate SENSE-LORAKS, which enables combined parallel imaging and partial Fourier reconstruction without requiring prior phase information. Theory and Methods The proposed formulation is based on combining the classical SENSE model for parallel imaging data with the more recent LORAKS framework for MR image reconstruction using low-rank matrix modeling. Previous LORAKS-based methods have successfully enabled calibrationless partial Fourier parallel MRI reconstruction, but have been most successful with nonuniform sampling strategies that may be hard to implement for certain applications. By combining LORAKS with SENSE, we enable highly-accelerated partial Fourier MRI reconstruction for a broader range of sampling trajectories, including widely-used calibrationless uniformly-undersampled trajectories. Results Our empirical results with retrospectively undersampled datasets indicate that when SENSE-LORAKS reconstruction is combined with an appropriate k-space sampling trajectory, it can provide substantially better image quality at high-acceleration rates relative to existing state-of-the-art reconstruction approaches. Conclusion The SENSE-LORAKS framework provides promising new opportunities for highly-accelerated MRI. PMID:27037836

  20. A novel colorimetric aptasensor for ultrasensitive detection of cocaine based on the formation of three-way junction pockets on the surfaces of gold nanoparticles.

    PubMed

    Abnous, Khalil; Danesh, Noor Mohammad; Ramezani, Mohammad; Taghdisi, Seyed Mohammad; Emrani, Ahmad Sarreshtehdar

    2018-08-22

    Herein, a novel colorimetric aptasensor was introduced for detection of cocaine based on the formation of three-way junction pockets on the surfaces of gold nanoparticles (AuNPs) and the catalytic activity of the surfaces of AuNPs. Simplicity and detection of cocaine in a short time (only 35 min) are some of the unique features of the proposed sensing strategy. In the presence of cocaine, triple-fragment aptamer (TFA) forms on the surfaces of AuNPs, leading to a significant decrease of the catalytic activity of AuNPs and the color of samples remains yellow. In the absence of target, TFA does not form on the surfaces of AuNPs and 4-Nitrophenol, as a colorimetric agent, has more access to the surfaces of AuNPs, resulting in the reduction of 4-Nitrophenol and the color of sample changes from yellow to colorless. The sensing strategy showed good specificity, a limit of detection (LOD) of 440 pM and a dynamic range over 2-100 nM. The sensing method was also successfully applied to detect cocaine in spiked human serum samples with recovery of 94.71-98.63%. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. A pragmatic examination of active and passive recruitment methods to improve the reach of community lifestyle programs: The Talking Health Trial.

    PubMed

    Estabrooks, Paul; You, Wen; Hedrick, Valisa; Reinholt, Margaret; Dohm, Erin; Zoellner, Jamie

    2017-01-19

    A primary challenge for behavior change strategies is ensuring that interventions can be effective while also attracting a broad and representative sample of the target population. The purpose of this case-study was to report on (1) the reach of a randomized controlled trial targeting reduced sugary beverages, (2) potential participant characteristic differences based on active versus passive recruitment strategies, and (3) recruitment strategy cost. Demographic and recruitment information was obtained for 8 counties and for individuals screened for participation. Personnel activities and time were tracked. Costs were calculated and compared by active versus passive recruitment. Six hundred and twenty of the 1,056 individuals screened were eligible, and 301 enrolled (77% women; 90% white; mean income $21,981 ± 16,443). Eighty-two percent and 44% of those responding to passive and active methods, respectively, enrolled in the trial. However, active recruitment strategies yielded considerably more enrolled (active = 199; passive = 102) individuals. Passive recruitment strategies yielded a less representative sample in terms of gender (more women), education (higher), and income (higher; p's <0.05). The average cost of an actively recruited and enrolled participant was $278 compared to $117 for a passively recruited and enrolled participant. Though passive recruitment is more cost-efficient, it may reduce the reach of sugary drink reduction strategies among rural residents with lower education and income. Clinicaltrials.gov; ID: NCT02193009, July 2014, retrospectively registered.
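
    The cost comparison reported above amounts to simple per-enrollee arithmetic; the short Python sketch below reproduces that arithmetic with the abstract's per-participant costs and enrollment counts, and the computed totals are illustrative rather than the study's actual budget lines.

      # Worked sketch of the reported cost comparison; enrollment counts and
      # per-enrollee costs come from the abstract, computed totals are illustrative.
      enrolled = {'active': 199, 'passive': 102}
      cost_per_enrollee = {'active': 278.0, 'passive': 117.0}   # USD

      for strategy in enrolled:
          total = enrolled[strategy] * cost_per_enrollee[strategy]
          print(f"{strategy:>7}: {enrolled[strategy]} enrolled x ${cost_per_enrollee[strategy]:.0f} "
                f"= ${total:,.0f} estimated recruitment cost")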

  2. Triplex DNA formation-mediated strand displacement reaction for highly sensitive fluorescent detection of melamine.

    PubMed

    Liu, Xiaojuan; Xu, Ningning; Gai, Panpan; Li, Feng

    2018-08-01

    Since melamine is a strong hazard to human health, the development of new methods for highly sensitive detection of melamine is highly desirable. Herein, a novel fluorescent biosensing strategy was designed for sensitive and selective melamine assay based on the recognition ability of an abasic (AP) site in a triplex towards melamine and signal amplification by Mg²⁺-dependent DNAzyme. In this strategy, the melamine-induced formation of triplex DNA was employed to trigger the strand displacement reaction (SDR). The SDR process converted the specific target recognition into the release and activation of Mg²⁺-dependent DNAzyme, which could catalyze the cleavage of fluorophore/quencher labeled DNA substrate (FQ), resulting in a significantly increased fluorescent signal. Under the optimal conditions, the fluorescent signal has a linear relationship with the logarithm of the melamine concentration in a wide range of 0.005-50 μM. The detection limit was estimated to be 0.9 nM (0.1 ppb), which is sufficiently sensitive for practical application. Furthermore, this strategy exhibits high selectivity against other potential interfering substances, and the practical application of this strategy for milk samples reveals that the proposed strategy works well for melamine assay in real samples. Therefore, this strategy presents a new method for the sensitive melamine assay and holds great promise for sensing applications in the environment and the food safety field. Copyright © 2018 Elsevier B.V. All rights reserved.
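
    A generic sketch of how a log-linear calibration curve and a limit of detection might be computed for an assay of this kind is given below (assuming a blank-plus-3σ LOD criterion); the standards, signals, and blank statistics are invented and are not the authors' data or procedure.

      # Generic sketch (assumptions: log-linear calibration, LOD at blank + 3*SD of
      # blank); not the authors' exact data or procedure.
      import numpy as np

      conc_uM = np.array([0.005, 0.05, 0.5, 5.0, 50.0])        # melamine standards, uM
      signal  = np.array([120., 310., 505., 700., 905.])       # fluorescence (arbitrary units)

      slope, intercept = np.polyfit(np.log10(conc_uM), signal, 1)   # F = a*log10(C) + b

      blank_mean, blank_sd = 55.0, 6.0                         # replicate blank measurements
      lod_signal = blank_mean + 3 * blank_sd
      lod_conc = 10 ** ((lod_signal - intercept) / slope)      # invert the calibration

      unknown = 10 ** ((430.0 - intercept) / slope)            # quantify an unknown sample
      print(f"LOD ~ {lod_conc * 1e3:.2f} nM, unknown ~ {unknown:.3f} uM")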

  3. Contemporary Sex-Based Differences by Age in Presenting Characteristics, Use of an Early Invasive Strategy, and Inhospital Mortality in Patients With Non-ST-Segment-Elevation Myocardial Infarction in the United States.

    PubMed

    Gupta, Tanush; Kolte, Dhaval; Khera, Sahil; Agarwal, Nayan; Villablanca, Pedro A; Goel, Kashish; Patel, Kavisha; Aronow, Wilbert S; Wiley, Jose; Bortnick, Anna E; Aronow, Herbert D; Abbott, J Dawn; Pyo, Robert T; Panza, Julio A; Menegus, Mark A; Rihal, Charanjit S; Fonarow, Gregg C; Garcia, Mario J; Bhatt, Deepak L

    2018-01-01

    Prior studies have reported higher inhospital mortality in women versus men with non-ST-segment-elevation myocardial infarction. Whether this is because of worse baseline risk profile compared with men or sex-based disparities in treatment is not completely understood. We queried the 2003 to 2014 National Inpatient Sample databases to identify all hospitalizations in patients aged ≥18 years with the principal diagnosis of non-ST-segment-elevation myocardial infarction. Complex samples multivariable logistic regression models were used to examine sex differences in use of an early invasive strategy and inhospital mortality. Of 4 765 739 patients with non-ST-segment-elevation myocardial infarction, 2 026 285 (42.5%) were women. Women were on average 6 years older than men and had a higher comorbidity burden. Women were less likely to be treated with an early invasive strategy (29.4% versus 39.2%; adjusted odds ratio, 0.92; 95% confidence interval, 0.91-0.94). Women had higher crude inhospital mortality than men (4.7% versus 3.9%; unadjusted odds ratio, 1.22; 95% confidence interval, 1.20-1.25). After adjustment for age (adjusted odds ratio, 0.96; 95% confidence interval, 0.94-0.98) and additionally for comorbidities, other demographics, and hospital characteristics, women had 10% lower odds of inhospital mortality (adjusted odds ratio, 0.90; 95% confidence interval, 0.89-0.92). Further adjustment for differences in the use of an early invasive strategy did not change the association between female sex and lower risk-adjusted inhospital mortality. Although women were less likely to be treated with an early invasive strategy compared with men, the lower use of an early invasive strategy was not responsible for the higher crude inhospital mortality in women, which could be entirely explained by older age and higher comorbidity burden. © 2018 American Heart Association, Inc.
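
    The adjustment logic described above (crude versus age- and comorbidity-adjusted odds ratios from multivariable logistic regression) can be sketched on simulated data as below; this is not the National Inpatient Sample complex-survey analysis, and the coefficients and sample are purely illustrative.

      # Illustrative sketch on simulated data (not the NIS complex-survey analysis):
      # crude vs age-adjusted odds ratio for in-hospital mortality by sex.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 20000
      female = rng.integers(0, 2, n)
      age = rng.normal(67 + 6 * female, 12, n)                 # women ~6 years older on average
      logit = -9.0 + 0.09 * age - 0.10 * female                # toy model: older age raises risk,
      p = 1 / (1 + np.exp(-logit))                             # female sex slightly lowers it
      died = rng.binomial(1, p)

      crude = sm.Logit(died, sm.add_constant(female.astype(float))).fit(disp=0)
      adjusted = sm.Logit(died, sm.add_constant(np.column_stack([female, age]).astype(float))).fit(disp=0)

      print("crude OR (female):   ", np.exp(crude.params[1]).round(2))
      print("adjusted OR (female):", np.exp(adjusted.params[1]).round(2))
      print("95% CI:", np.exp(adjusted.conf_int()[1]).round(2))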

  4. Effects of Higher-Order Cognitive Strategy Training on Gist-Reasoning and Fact-Learning in Adolescents

    PubMed Central

    Gamino, Jacquelyn F.; Chapman, Sandra B.; Hull, Elizabeth L.; Lyon, G. Reid

    2010-01-01

    Improving the reasoning skills of adolescents across the United States has become a major concern for educators and scientists who are dedicated to identifying evidence-based protocols to improve student outcomes. This small-sample, randomized, controlled pilot study sought to determine the efficacy of higher-order cognitive training on gist-reasoning and fact-learning in an inner-city public middle school. The study compared gist-reasoning and fact-learning performances after training in a smaller sample when tested in Spanish, the native language of many of the students, versus English. The 54 eighth-grade students who participated in this pilot study were enrolled in an urban middle school, predominantly from lower socio-economic status families, and were primarily of minority descent. The students were randomized into one of three groups, one that learned cognitive strategies promoting abstraction of meaning, a group that learned rote memory strategies, or a control group to ascertain the impact of each program on gist-reasoning and fact-learning from text-based information. We found that the students who had cognitive strategy instruction that entailed abstraction of meaning significantly improved their gist-reasoning and fact-learning ability. The students who learned rote memory strategies significantly improved their fact-learning scores from a text but not gist-reasoning ability. The control group showed no significant change in either gist-reasoning or fact-learning ability. A trend toward significant improvement in overall reading scores for the group that learned to abstract meaning as well as a significant correlation between gist-reasoning ability and critical thinking on a state-mandated standardized reading test was also found. There were no significant differences between English and Spanish performance of gist-reasoning and fact-learning. Our findings suggest that teaching higher-order cognitive strategies facilitates gist-reasoning ability and student learning. PMID:21833248

  5. Rule-Based Category Learning in Children: The Role of Age and Executive Functioning

    PubMed Central

    Rabi, Rahel; Minda, John Paul

    2014-01-01

    Rule-based category learning was examined in 4–11 year-olds and adults. Participants were asked to learn a set of novel perceptual categories in a classification learning task. Categorization performance improved with age, with younger children showing the strongest rule-based deficit relative to older children and adults. Model-based analyses provided insight regarding the type of strategy being used to solve the categorization task, demonstrating that the use of the task appropriate strategy increased with age. When children and adults who identified the correct categorization rule were compared, the performance deficit was no longer evident. Executive functions were also measured. While both working memory and inhibitory control were related to rule-based categorization and improved with age, working memory specifically was found to marginally mediate the age-related improvements in categorization. When analyses focused only on the sample of children, results showed that working memory ability and inhibitory control were associated with categorization performance and strategy use. The current findings track changes in categorization performance across childhood, demonstrating at which points performance begins to mature and resemble that of adults. Additionally, findings highlight the potential role that working memory and inhibitory control may play in rule-based category learning. PMID:24489658

  6. A magnetic bead-based method for concentrating DNA from human urine for downstream detection.

    PubMed

    Bordelon, Hali; Russ, Patricia K; Wright, David W; Haselton, Frederick R

    2013-01-01

    Due to the presence of PCR inhibitors, PCR cannot be used directly on most clinical samples, including human urine, without pre-treatment. A magnetic bead-based strategy is one potential method to collect biomarkers from urine samples and separate the biomarkers from PCR inhibitors. In this report, a 1 mL urine sample was mixed within the bulb of a transfer pipette containing lyophilized nucleic acid-silica adsorption buffer and silica-coated magnetic beads. After mixing, the sample was transferred from the pipette bulb to a small diameter tube, and captured biomarkers were concentrated using magnetic entrainment of beads through pre-arrayed wash solutions separated by small air gaps. Feasibility was tested using synthetic segments of the 140 bp tuberculosis IS6110 DNA sequence spiked into pooled human urine samples. DNA recovery was evaluated by qPCR. Despite the presence of spiked DNA, no DNA was detectable in unextracted urine samples, presumably due to the presence of PCR inhibitors. However, following extraction with the magnetic bead-based method, we found that ∼50% of spiked TB DNA was recovered from human urine containing roughly 5×10³ to 5×10⁸ copies of IS6110 DNA. In addition, the DNA was concentrated approximately ten-fold into water. The final concentration of DNA in the eluate was 5×10⁶, 14×10⁶, and 8×10⁶ copies/µL for 1, 3, and 5 mL urine samples, respectively. Lyophilized and freshly prepared reagents within the transfer pipette produced similar results, suggesting that long-term storage without refrigeration is possible. DNA recovery increased with the length of the spiked DNA segments from 10±0.9% for a 75 bp DNA sequence to 42±4% for a 100 bp segment and 58±9% for a 140 bp segment. The estimated LOD was 77 copies of DNA/µL of urine. The strategy presented here provides a simple means to achieve high nucleic acid recovery from easily obtained urine samples in a form that is free of PCR inhibitors.
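
    The recovery and fold-concentration figures above follow from straightforward arithmetic on the qPCR results; the Python sketch below shows that arithmetic with illustrative input values rather than the study's raw data.

      # Sketch of the recovery / concentration-factor arithmetic implied by the qPCR
      # comparison; the numbers below are illustrative inputs, not the study's raw data.
      def recovery_and_fold(spiked_copies, urine_volume_uL, eluate_conc_copies_per_uL, eluate_volume_uL):
          recovered = eluate_conc_copies_per_uL * eluate_volume_uL
          recovery_pct = 100.0 * recovered / spiked_copies
          input_conc = spiked_copies / urine_volume_uL
          fold_concentration = eluate_conc_copies_per_uL / input_conc
          return recovery_pct, fold_concentration

      # e.g. 1 mL urine spiked with 1e7 copies, eluted into 50 uL at 1e5 copies/uL
      rec, fold = recovery_and_fold(1e7, 1000.0, 1e5, 50.0)
      print(f"recovery ~ {rec:.0f}%, concentrated ~ {fold:.0f}-fold")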

  7. A Magnetic Bead-Based Method for Concentrating DNA from Human Urine for Downstream Detection

    PubMed Central

    Bordelon, Hali; Russ, Patricia K.; Wright, David W.; Haselton, Frederick R.

    2013-01-01

    Due to the presence of PCR inhibitors, PCR cannot be used directly on most clinical samples, including human urine, without pre-treatment. A magnetic bead-based strategy is one potential method to collect biomarkers from urine samples and separate the biomarkers from PCR inhibitors. In this report, a 1 mL urine sample was mixed within the bulb of a transfer pipette containing lyophilized nucleic acid-silica adsorption buffer and silica-coated magnetic beads. After mixing, the sample was transferred from the pipette bulb to a small diameter tube, and captured biomarkers were concentrated using magnetic entrainment of beads through pre-arrayed wash solutions separated by small air gaps. Feasibility was tested using synthetic segments of the 140 bp tuberculosis IS6110 DNA sequence spiked into pooled human urine samples. DNA recovery was evaluated by qPCR. Despite the presence of spiked DNA, no DNA was detectable in unextracted urine samples, presumably due to the presence of PCR inhibitors. However, following extraction with the magnetic bead-based method, we found that ∼50% of spiked TB DNA was recovered from human urine containing roughly 5×10³ to 5×10⁸ copies of IS6110 DNA. In addition, the DNA was concentrated approximately ten-fold into water. The final concentration of DNA in the eluate was 5×10⁶, 14×10⁶, and 8×10⁶ copies/µL for 1, 3, and 5 mL urine samples, respectively. Lyophilized and freshly prepared reagents within the transfer pipette produced similar results, suggesting that long-term storage without refrigeration is possible. DNA recovery increased with the length of the spiked DNA segments from 10±0.9% for a 75 bp DNA sequence to 42±4% for a 100 bp segment and 58±9% for a 140 bp segment. The estimated LOD was 77 copies of DNA/µL of urine. The strategy presented here provides a simple means to achieve high nucleic acid recovery from easily obtained urine samples in a form that is free of PCR inhibitors. PMID:23861895

  8. Accurate Sample Assignment in a Multiplexed, Ultrasensitive, High-Throughput Sequencing Assay for Minimal Residual Disease.

    PubMed

    Bartram, Jack; Mountjoy, Edward; Brooks, Tony; Hancock, Jeremy; Williamson, Helen; Wright, Gary; Moppett, John; Goulden, Nick; Hubank, Mike

    2016-07-01

    High-throughput sequencing (HTS; next-generation sequencing) of the rearranged Ig and T-cell receptor genes promises to be less expensive and more sensitive than current methods of monitoring minimal residual disease (MRD) in patients with acute lymphoblastic leukemia. However, the adoption of new approaches by clinical laboratories requires careful evaluation of all potential sources of error and the development of strategies to ensure the highest accuracy. Timely and efficient clinical use of HTS platforms will depend on combining multiple samples (multiplexing) in each sequencing run. Here we examine Ig heavy-chain gene HTS on the Illumina MiSeq platform for MRD. We identify errors associated with multiplexing that could potentially impact the accuracy of MRD analysis. We optimize a strategy that combines high-purity, sequence-optimized oligonucleotides, dual indexing, and an error-aware demultiplexing approach to minimize errors and maximize sensitivity. We present a probability-based demultiplexing pipeline, Error-Aware Demultiplexer, that is suitable for all MiSeq strategies and accurately assigns samples to the correct identifier without excessive loss of data. Finally, using controls quantified by digital PCR, we show that HTS-MRD can accurately detect as few as 1 in 10⁶ copies of specific leukemic MRD. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
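
    A minimal sketch of error-tolerant dual-index demultiplexing, the general idea behind assigning multiplexed reads to samples, is shown below. It uses a simple Hamming-distance rule with hypothetical index sequences and is not the probability model of the Error-Aware Demultiplexer pipeline.

      # Minimal sketch of error-tolerant dual-index demultiplexing: assign a read to
      # the sample whose (i7, i5) index pair is closest in Hamming distance, provided
      # the best match is within a tolerance and unambiguous. Illustrative only.
      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      def assign_read(read_i7, read_i5, sample_indices, max_mismatch=1):
          """sample_indices: {sample_name: (i7, i5)}; returns sample name or None."""
          scored = sorted((hamming(read_i7, i7) + hamming(read_i5, i5), name)
                          for name, (i7, i5) in sample_indices.items())
          best_dist, best_name = scored[0]
          ambiguous = len(scored) > 1 and scored[1][0] == best_dist
          if best_dist <= max_mismatch and not ambiguous:
              return best_name
          return None                                   # leave undetermined rather than misassign

      samples = {'patient_A': ('ACGTACGT', 'TTGACCAA'),
                 'patient_B': ('ACGTAGGT', 'TTGACCAA')}    # hypothetical index pairs
      print(assign_read('ACGTACGT', 'TTGACCAA', samples))  # exact match -> patient_A
      print(assign_read('ACGTACGA', 'TTGACCAA', samples))  # 1 vs 2 mismatches -> patient_A
      print(assign_read('ACGTAAGT', 'TTGACCAA', samples))  # tie at distance 1 -> None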

  9. Evaluation of spot and passive sampling for monitoring, flux estimation and risk assessment of pesticides within the constraints of a typical regulatory monitoring scheme.

    PubMed

    Zhang, Zulin; Troldborg, Mads; Yates, Kyari; Osprey, Mark; Kerr, Christine; Hallett, Paul D; Baggaley, Nikki; Rhind, Stewart M; Dawson, Julian J C; Hough, Rupert L

    2016-11-01

    In many agricultural catchments of Europe and North America, pesticides occur at generally low concentrations with significant temporal variation. This poses several challenges for both monitoring and understanding ecological risks/impacts of these chemicals. This study aimed to compare the performance of passive and spot sampling strategies given the constraints of typical regulatory monitoring. Nine pesticides were investigated in a river currently undergoing regulatory monitoring (River Ugie, Scotland). Within this regulatory framework, spot and passive sampling were undertaken to understand spatiotemporal occurrence, mass loads and ecological risks. All the target pesticides were detected in water by both sampling strategies. Chlorotoluron was observed to be the dominant pesticide by both spot (maximum: 111.8 ng/l, mean: 9.35 ng/l) and passive sampling (maximum: 39.24 ng/l, mean: 4.76 ng/l). The annual pesticide loads were estimated to be 2735 g and 1837 g based on the spot and passive sampling data, respectively. The spatiotemporal trend suggested that agricultural activities were the primary source of the compounds with variability in loads explained in large part by timing of pesticide applications and rainfall. The risk assessment showed chlorotoluron and chlorpyrifos posed the highest ecological risks with 23% of the chlorotoluron spot samples and 36% of the chlorpyrifos passive samples resulting in a Risk Quotient greater than 0.1. This suggests that mitigation measures might need to be taken to reduce the input of pesticides into the river. The overall comparison of the two sampling strategies supported the hypothesis that passive sampling tends to integrate the contaminants over a period of exposure and allows quantification of contamination at low concentration. The results suggested that within a regulatory monitoring context passive sampling was more suitable for flux estimation and risk assessment of trace contaminants which cannot be diagnosed by spot sampling and for determining if long-term average concentrations comply with specified standards. Copyright © 2016 Elsevier B.V. All rights reserved.
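
    The load and risk figures above rest on two simple calculations: a concentration-times-discharge flux integral and a Risk Quotient (measured concentration divided by a predicted no-effect concentration, flagged above 0.1). The Python sketch below illustrates both, with placeholder PNECs, concentrations, and sampling intervals rather than the study's values.

      # Sketch of the flux and risk-quotient arithmetic (assumption: RQ = measured
      # concentration / PNEC, flagged when RQ > 0.1). All values are placeholders.
      samples = [   # (mean concentration ng/L over interval, mean discharge L/s, interval length s)
          (9.35, 400.0, 7 * 86400),
          (25.0, 900.0, 7 * 86400),
          (4.0,  300.0, 7 * 86400),
      ]
      load_g = sum(c * q * dt for c, q, dt in samples) * 1e-9   # ng -> g

      pnec_ng_per_L = {'chlorotoluron': 100.0, 'chlorpyrifos': 0.5}    # illustrative PNECs
      measured = {'chlorotoluron': 39.24, 'chlorpyrifos': 0.2}         # illustrative concentrations, ng/L
      for pesticide, conc in measured.items():
          rq = conc / pnec_ng_per_L[pesticide]
          flag = "  <-- above 0.1 screening threshold" if rq > 0.1 else ""
          print(f"{pesticide}: RQ = {rq:.2f}{flag}")
      print(f"estimated load over the three intervals: {load_g:.0f} g")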

  10. Using intervention mapping for the development of a targeted secure web-based outreach strategy named SafeFriend, for Chlamydia trachomatis testing in young people at risk

    PubMed Central

    2013-01-01

    Background Many young people at high risk for Chlamydia trachomatis (Ct) are not reached by current sexual health care systems, such as general practitioners and public sexual health care centres (sexually transmitted infection clinics). Ct is the most frequently diagnosed bacterial sexually transmitted infection (STI) among sexually active people and in particular young heterosexuals. Innovative screening strategies are needed to interrupt the transmission of Ct among young people and connect the hidden cases to care. Methods Intervention Mapping (IM), a systematic approach to develop theory- and evidence-based interventions, was used to develop a strategy to target Ct testing towards young people who are currently hidden to care in The Netherlands. Both clinical users (i.e. sexual health care nurses) and public users (i.e., young people at risk for Ct) were closely involved in the IM process. A needs assessment study was carried out using semi-structured interviews among users (N = 21), a literature search and by taking lessons learned from existing screening programmes. Theoretical methods and practical applications to reach high risk young people and influence testing were selected and translated into specific programme components. Results The IM approach resulted in the development of a secure and web-based outreach Ct screening strategy, named SafeFriend. It is developed to target groups of high-risk young people who are currently hidden to care. Key methods include web-based Respondent Driven Sampling, starting from young Ct positive sexual health care centre clients, to reach and motivate peers (i.e., sex partners and friends) to get tested for Ct. Testing and the motivation of peers were proposed as the desired behavioural outcomes and the Precaution Adoption Process Model was chosen as theoretical framework. End users, i.e., young people and sexual health care nurses were interviewed and included in the development process to increase the success of implementation. Conclusions IM proved useful to develop an intervention for targeted Ct testing among young people. We believe this to be the first web-based outreach screening strategy which combines chain referral sampling with the delivery of targeted Ct testing to high risk young people within their sexual and social networks. PMID:24148656

  11. Using intervention mapping for the development of a targeted secure web-based outreach strategy named SafeFriend, for Chlamydia trachomatis testing in young people at risk.

    PubMed

    Theunissen, Kevin A T M; Hoebe, Christian J P A; Crutzen, Rik; Kara-Zaïtri, Chakib; de Vries, Nanne K; van Bergen, Jan E A M; van der Sande, Marianne A B; Dukers-Muijrers, Nicole H T M

    2013-10-22

    Many young people at high risk for Chlamydia trachomatis (Ct) are not reached by current sexual health care systems, such as general practitioners and public sexual health care centres (sexually transmitted infection clinics). Ct is the most frequently diagnosed bacterial sexually transmitted infection (STI) among sexually active people and in particular young heterosexuals. Innovative screening strategies are needed to interrupt the transmission of Ct among young people and connect the hidden cases to care. Intervention Mapping (IM), a systematic approach to develop theory- and evidence-based interventions, was used to develop a strategy to target Ct testing towards young people who are currently hidden to care in The Netherlands. Both clinical users (i.e. sexual health care nurses) and public users (i.e., young people at risk for Ct) were closely involved in the IM process. A needs assessment study was carried out using semi-structured interviews among users (N = 21), a literature search and by taking lessons learned from existing screening programmes. Theoretical methods and practical applications to reach high risk young people and influence testing were selected and translated into specific programme components. The IM approach resulted in the development of a secure and web-based outreach Ct screening strategy, named SafeFriend. It is developed to target groups of high-risk young people who are currently hidden to care. Key methods include web-based Respondent Driven Sampling, starting from young Ct positive sexual health care centre clients, to reach and motivate peers (i.e., sex partners and friends) to get tested for Ct. Testing and the motivation of peers were proposed as the desired behavioural outcomes and the Precaution Adoption Process Model was chosen as theoretical framework. End users, i.e., young people and sexual health care nurses were interviewed and included in the development process to increase the success of implementation. IM proved useful to develop an intervention for targeted Ct testing among young people. We believe this to be the first web-based outreach screening strategy which combines chain referral sampling with the delivery of targeted Ct testing to high risk young people within their sexual and social networks.

  12. Applying Active Learning to Assertion Classification of Concepts in Clinical Text

    PubMed Central

    Chen, Yukun; Mani, Subramani; Xu, Hua

    2012-01-01

    Supervised machine learning methods for clinical natural language processing (NLP) research require a large number of annotated samples, which are very expensive to build because of the involvement of physicians. Active learning, an approach that actively samples from a large pool, provides an alternative solution. Its major goal in classification is to reduce the annotation effort while maintaining the quality of the predictive model. However, few studies have investigated its uses in clinical NLP. This paper reports an application of active learning to a clinical text classification task: to determine the assertion status of clinical concepts. The annotated corpus for the assertion classification task in the 2010 i2b2/VA Clinical NLP Challenge was used in this study. We implemented several existing and newly developed active learning algorithms and assessed their uses. The outcome is reported in the global ALC score, based on the Area under the average Learning Curve of the AUC (Area Under the Curve) score. Results showed that when the same number of annotated samples was used, active learning strategies could generate better classification models (best ALC – 0.7715) than the passive learning method (random sampling) (ALC – 0.7411). Moreover, to achieve the same classification performance, active learning strategies required fewer samples than the random sampling method. For example, to achieve an AUC of 0.79, the random sampling method used 32 samples, while our best active learning algorithm required only 12 samples, a reduction of 62.5% in manual annotation effort. PMID:22127105
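
    A minimal pool-based active learning loop of the kind evaluated above (uncertainty sampling versus random sampling, with a crude learning-curve summary) is sketched below in Python; the dataset, classifier, and scoring are illustrative stand-ins, not the algorithms or the ALC computation used in the study.

      # Minimal pool-based active learning sketch (uncertainty sampling vs random
      # sampling); an illustration of the general approach, not the study's algorithms.
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
      X_pool, y_pool, X_test, y_test = X[:1500], y[:1500], X[1500:], y[1500:]

      def run(strategy, seed_per_class=5, batch=10, rounds=15):
          rng = np.random.default_rng(0)
          labeled = list(rng.choice(np.where(y_pool == 0)[0], seed_per_class, replace=False)) + \
                    list(rng.choice(np.where(y_pool == 1)[0], seed_per_class, replace=False))
          aucs = []
          for _ in range(rounds):
              clf = LogisticRegression(max_iter=1000).fit(X_pool[labeled], y_pool[labeled])
              aucs.append(roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
              unlabeled = np.setdiff1d(np.arange(len(X_pool)), labeled)
              if strategy == 'uncertainty':                      # query the least confident samples
                  conf = np.abs(clf.predict_proba(X_pool[unlabeled])[:, 1] - 0.5)
                  pick = unlabeled[np.argsort(conf)[:batch]]
              else:                                              # passive baseline: random sampling
                  pick = rng.choice(unlabeled, batch, replace=False)
              labeled.extend(pick.tolist())
          return np.mean(aucs)                                   # crude stand-in for an ALC-style summary

      print("uncertainty:", round(run('uncertainty'), 4), " random:", round(run('random'), 4))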

  13. A typology of pain coping strategies in pediatric patients with chronic abdominal pain.

    PubMed

    Walker, Lynn S; Baber, Kari Freeman; Garber, Judy; Smith, Craig A

    2008-07-15

    This study aimed to identify clinically meaningful profiles of pain coping strategies used by youth with chronic abdominal pain (CAP). Participants (n=699) were pediatric patients (ages 8-18 years) and their parents. Patients completed the Pain Response Inventory (PRI) and measures of somatic and depressive symptoms, disability, pain severity and pain efficacy, and perceived competence. Parents rated their children's pain severity and coping efficacy. Hierarchical cluster analysis based on the 13 PRI subscales identified pain coping profiles in Sample 1 (n=311) that replicated in Sample 2 (n=388). Evidence was found of external validity and distinctiveness of the profiles. The findings support a typology of pain coping that reflects the quality of patients' pain mastery efforts and interpersonal relationships associated with pain coping. Results are discussed in relation to developmental processes, attachment styles, and treatment implications.
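
    The profile-identification approach above (hierarchical cluster analysis over subscale scores) can be illustrated generically as follows; the simulated scores stand in for the 13 PRI subscales, and the linkage method and number of clusters are assumptions rather than the study's choices.

      # Generic sketch of hierarchical cluster analysis on questionnaire subscale
      # scores (simulated data standing in for the 13 PRI subscales).
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.stats import zscore

      rng = np.random.default_rng(2)
      n_patients, n_subscales = 311, 13
      scores = rng.normal(size=(n_patients, n_subscales))
      scores[:100, :4] += 1.5                                  # implant one artificial "profile"

      Z = linkage(zscore(scores, axis=0), method='ward')       # Ward linkage on standardized scores
      profiles = fcluster(Z, t=3, criterion='maxclust')        # cut the tree into 3 profiles

      for k in np.unique(profiles):
          print(f"profile {k}: n = {np.sum(profiles == k)}")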

  14. Adaptive Test Schemes for Control of Paratuberculosis in Dairy Cows.

    PubMed

    Kirkeby, Carsten; Græsbøll, Kaare; Nielsen, Søren Saxmose; Christiansen, Lasse Engbo; Toft, Nils; Halasa, Tariq

    2016-01-01

    Paratuberculosis is a chronic infection that in dairy cattle causes reduced milk yield, weight loss, and ultimately fatal diarrhea. Subclinical animals can excrete bacteria (Mycobacterium avium ssp. paratuberculosis, MAP) in feces and infect other animals. Farmers identify the infectious animals through a variety of test-strategies, but are challenged by the lack of perfect tests. Frequent testing increases sensitivity, but the costs of testing are a cause of concern for farmers. Here, we used a herd simulation model with milk ELISA tests to evaluate the epidemiological and economic consequences of continuously adapting the sampling interval in response to the estimated true prevalence in the herd. The key results were that the true prevalence was greatly affected by the hygiene level and to some extent by the test-frequency. Furthermore, the choice of prevalence that will be tolerated in a control scenario had a major impact on the true prevalence in the normal hygiene setting, but less so when the hygiene was poor. The net revenue is not greatly affected by the test-strategy, because of the general variation in net revenues between farms. An exception to this is the low hygiene herd, where frequent testing results in lower revenue. The probability of eradication is correlated with the testing frequency and the target prevalence during the control phase. The probability of eradication is low in the low hygiene herd, and a test-and-cull strategy should probably not be the primary strategy in this herd. Based on this study we suggest that, in order to control MAP, the standard Danish dairy farm should use an adaptive strategy in which a short sampling interval of three months is used when the estimated true prevalence is above 1%, and a long sampling interval of one year otherwise.
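
    The adaptive rule described above, a three-month sampling interval while the estimated true prevalence exceeds 1% and a one-year interval otherwise, is sketched below; the prevalence trajectory is a toy simulation used only to exercise the rule, not the herd model from the study.

      # Sketch of the adaptive testing rule from the abstract: test every 3 months
      # while estimated true prevalence is above 1%, otherwise yearly. The prevalence
      # dynamics below are invented just to exercise the rule.
      import numpy as np

      def next_interval_months(estimated_true_prevalence, threshold=0.01):
          return 3 if estimated_true_prevalence > threshold else 12

      rng = np.random.default_rng(3)
      prevalence, month, schedule = 0.06, 0, []
      while month < 120 and prevalence > 0.001:
          interval = next_interval_months(prevalence)
          schedule.append((month, round(prevalence, 3), interval))
          # toy dynamics: control pulls prevalence down faster when testing is frequent
          decay = 0.35 if interval == 3 else 0.10
          prevalence = max(0.0, prevalence * (1 - decay) + rng.normal(0, 0.002))
          month += interval

      for month, prev, interval in schedule:
          print(f"month {month:3d}: est. prevalence {prev:.3f} -> next test in {interval} months")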

  15. A randomized controlled trial of an HIV/AIDS Symptom Management Manual for depressive symptoms.

    PubMed

    Eller, Lucille S; Kirksey, Kenn M; Nicholas, Patrice K; Corless, Inge B; Holzemer, William L; Wantland, Dean J; Willard, Suzanne S; Robinson, Linda; Hamilton, Mary Jane; Sefcik, Elizabeth F; Moezzi, Shahnaz; Mendez, Marta Rivero; Rosa, Maria; Human, Sarie

    2013-01-01

    Depressive symptoms are highly prevalent, underdiagnosed, and undertreated in people living with HIV/AIDS (PLWH), and are associated with poorer health outcomes. This randomized controlled trial examined the effects of the HIV/AIDS Symptom Management Manual self-care symptom management strategies compared with a nutrition manual on depressive symptoms in an international sample of PLWH. The sample consisted of a sub-group (N=222) of participants in a larger symptom management study who reported depressive symptoms. Depressive symptoms of the intervention (n=124) and control (n=98) groups were compared at three time points: baseline, one month, and two months. Use and effectiveness of specific strategies were examined. Depressive symptom frequency at baseline varied significantly by country (χ² = 12.9; p=0.04). Within the intervention group there were significant differences across time in depressive symptom frequency [F(2, 207) = 3.27, p=0.05], intensity [F(2, 91) = 4.6, p=0.01], and impact [F(2, 252) = 2.92, p=0.05], and these were significantly lower at one month but not at two months, suggesting that self-care strategies are effective in reducing depressive symptoms, although effects may be short term. The most used and most effective self-care strategies were distraction techniques and prayer. This study suggests that people living with HIV can be taught and will employ self-care strategies for management of depressive symptoms and that these strategies are effective in reducing these symptoms. Self-care strategies are noninvasive, have no side-effects, and can be readily taught as an adjunct to other forms of treatment. Studies are needed to identify the most effective self-care strategies and quantify optimum dose and frequency of use as a basis for evidence-based practice.

  16. Population-based validation of a German version of the Brief Resilience Scale

    PubMed Central

    Wenzel, Mario; Stieglitz, Rolf-Dieter; Kunzler, Angela; Bagusat, Christiana; Helmreich, Isabella; Gerlicher, Anna; Kampa, Miriam; Kubiak, Thomas; Kalisch, Raffael; Lieb, Klaus; Tüscher, Oliver

    2018-01-01

    Smith and colleagues developed the Brief Resilience Scale (BRS) to assess the individual ability to recover from stress despite significant adversity. This study aimed to validate the German version of the BRS. We used data from a population-based (sample 1: n = 1,481) and a representative (sample 2: n = 1,128) sample of participants from the German general population (age ≥ 18) to assess reliability and validity. Confirmatory factor analyses (CFA) were conducted to compare one- and two-factorial models from previous studies with a method-factor model that specifically accounts for the wording of the items. Reliability was analyzed. Convergent validity was measured by correlating BRS scores with mental health measures, coping, social support, and optimism. Reliability was good (α = .85, ω = .85 for both samples). The method-factor model showed excellent model fit (sample 1: χ²/df = 7.544; RMSEA = .07; CFI = .99; SRMR = .02; sample 2: χ²/df = 1.166; RMSEA = .01; CFI = 1.00; SRMR = .01) which was significantly better than the one-factor model (Δχ²(4) = 172.71, p < .001) or the two-factor model (Δχ²(3) = 31.16, p < .001). The BRS was positively correlated with well-being, social support, optimism, and the coping strategies active coping, positive reframing, acceptance, and humor. It was negatively correlated with somatic symptoms, anxiety and insomnia, social dysfunction, depression, and the coping strategies religion, denial, venting, substance use, and self-blame. To conclude, our results provide evidence for the reliability and validity of the German adaptation of the BRS as well as the unidimensional structure of the scale once method effects are accounted for. PMID:29438435
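
    The nested-model comparisons reported above can be checked with a chi-square difference test; the short sketch below computes the p-values for the reported Δχ²(4) = 172.71 and Δχ²(3) = 31.16 with scipy, as a generic check rather than a re-analysis.

      # Worked check of the reported nested-model comparisons: p-values for the
      # chi-square difference tests from the abstract.
      from scipy.stats import chi2

      for delta_chi2, delta_df in [(172.71, 4), (31.16, 3)]:
          p = chi2.sf(delta_chi2, delta_df)          # survival function = 1 - CDF
          print(f"delta chi2({delta_df}) = {delta_chi2}: p = {p:.2e}")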

  17. Determining best practices in reconnoitering sites for habitability potential on Mars using a semi-autonomous rover: A GeoHeuristic Operational Strategies Test

    PubMed Central

    Yingst, R.A.; Berger, J.; Cohen, B.A.; Hynek, B.; Schmidt, M.E.

    2017-01-01

    We tested science operations strategies developed for use in remote mobile spacecraft missions, to determine whether reconnoitering a site of potential habitability prior to in-depth study (a walkabout-first strategy) can be a more efficient use of time and resources than the linear approach commonly used by planetary rover missions. Two field teams studied a sedimentary sequence in Utah to assess habitability potential. At each site one team commanded a human “rover” to execute observations and conducted data analysis and made follow-on decisions based solely on those observations. Another team followed the same traverse using traditional terrestrial field methods, and the results of the two teams were compared. Test results indicate that for a mission with goals similar to our field case, the walkabout-first strategy may save time and other mission resources, while improving science return. The approach enabled more informed choices and higher team confidence in choosing where to spend time and other consumable resources. The walkabout strategy may prove most efficient when many close sites must be triaged to a smaller subset for detailed study or sampling. This situation would arise when mission goals include finding, identifying, characterizing or sampling a specific material, feature or type of environment within a certain area. PMID:29307922

  18. Sampling stored product insect pests: a comparison of four statistical sampling models for probability of pest detection

    USDA-ARS?s Scientific Manuscript database

    Statistically robust sampling strategies form an integral component of grain storage and handling activities throughout the world. Developing sampling strategies to target biological pests such as insects in stored grain is inherently difficult due to species biology and behavioral characteristics. ...

  19. Prevalence, sociodemographic factors, psychological distress, and coping strategies related to compulsive buying: a cross sectional study in Galicia, Spain.

    PubMed

    Otero-López, José Manuel; Villardefrancos, Estíbaliz

    2014-04-05

    Compulsive buying has become a serious problem affecting a growing number of people in contemporary consumer societies. Nevertheless, research examining its prevalence in representative samples from the general population is still scarce and mainly focused on the exploration of sociodemographic factors, neglecting other aspects like psychological distress and coping styles. Therefore, this study intends to contribute to the cumulative knowledge by assessing compulsive buying prevalence in a representative sample from the general population of the region of Galicia, Spain. Sociodemographic determinants, psychological symptoms, and coping strategies are also analyzed to clarify their role in this phenomenon. A random routes procedure was employed to recruit a sample of 2,159 participants, who were classified as either compulsive buyers or non-compulsive buyers. Both groups were compared regarding sociodemographic determinants, symptoms, and coping strategies through chi-square tests or analyses of variance. A multivariate logistic regression analysis was conducted to determine which of these determinants might play a part in the make-up of a risk profile for compulsive buying. Estimated prevalence of compulsive buying was 7.1%. Compulsive buyers and non-compulsive buyers differed significantly in sex and age, with women and younger people showing a higher propensity for this phenomenon. Individuals with compulsive buying presented significantly higher scores on all the psychological symptoms considered. They also employed passive-avoidance coping strategies much more frequently and active strategies of problem solving and cognitive restructuring much less frequently. The logistic regression results confirmed that being female, experiencing symptoms of anxiety, depression, and obsession-compulsion, and employing the passive-avoidance coping strategies of problem avoidance, wishful thinking, and self-criticism all constituted risk factors for compulsive buying, whilst increased age and the use of the active coping strategies of problem solving and cognitive restructuring were protective factors. Our findings revealed a substantial prevalence of compulsive buying. Additionally, the relevance of sociodemographic determinants, psychological distress, and coping strategies in this problem was confirmed. The establishment of a risk profile for compulsive buying based on these different sets of determinants would likely contribute to the development of more effective intervention programs.
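
    As an illustration of how a risk profile like the one above can be derived, the following sketch fits a multivariate logistic regression and reports odds ratios. The variable names, synthetic data, and use of statsmodels are assumptions made for demonstration; they do not reproduce the study's actual dataset or coefficients.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Illustrative multivariate logistic regression for a compulsive-buying
    # risk profile. Predictors and data are hypothetical placeholders.
    rng = np.random.default_rng(0)
    n = 500
    X = np.column_stack([
        rng.integers(0, 2, n),      # female (1 = yes)
        rng.normal(40, 12, n),      # age in years
        rng.normal(0, 1, n),        # anxiety score (standardized)
        rng.normal(0, 1, n),        # problem-avoidance coping (standardized)
    ])
    logits = -2.5 + 0.8 * X[:, 0] - 0.03 * X[:, 1] + 0.6 * X[:, 2] + 0.5 * X[:, 3]
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))  # 1 = compulsive buyer

    model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
    print(np.exp(model.params))  # odds ratios: >1 suggests a risk factor, <1 protective
    ```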

  20. Development and applicability of a ready-to-use PCR system for GMO screening.

    PubMed

    Rosa, Sabrina F; Gatto, Francesco; Angers-Loustau, Alexandre; Petrillo, Mauro; Kreysa, Joachim; Querci, Maddalena

    2016-06-15

    With the growing number of GMOs introduced to the market, testing laboratories have seen their workload increase significantly. Ready-to-use multi-target PCR-based detection systems, such as pre-spotted plates (PSP), reduce analysis time while increasing capacity. This paper describes the development and applicability to GMO testing of a screening strategy involving a PSP and its associated web-based Decision Support System. The screening PSP was developed to detect all GMOs authorized in the EU in a single PCR experiment, through the combination of 16 validated assays. The screening strategy was successfully challenged in a wide inter-laboratory study on real-life food/feed samples. The positive outcome of this study could result in the adoption of a PSP screening strategy across the EU; a step that would increase harmonization and quality of GMO testing in the EU. Furthermore, this system could represent a model for other official control areas where high-throughput DNA-based detection systems are needed. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Generalized Ensemble Sampling of Enzyme Reaction Free Energy Pathways

    PubMed Central

    Wu, Dongsheng; Fajer, Mikolai I.; Cao, Liaoran; Cheng, Xiaolin; Yang, Wei

    2016-01-01

    Free energy path sampling plays an essential role in computational understanding of chemical reactions, particularly those occurring in enzymatic environments. Among a variety of molecular dynamics simulation approaches, the generalized ensemble sampling strategy is uniquely attractive for the fact that it not only can enhance the sampling of rare chemical events but also can naturally ensure consistent exploration of environmental degrees of freedom. In this review, we plan to provide a tutorial-like tour on an emerging topic: generalized ensemble sampling of enzyme reaction free energy path. The discussion is largely focused on our own studies, particularly ones based on the metadynamics free energy sampling method and the on-the-path random walk path sampling method. We hope that this mini presentation will provide interested practitioners some meaningful guidance for future algorithm formulation and application study. PMID:27498634
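
    To make the generalized ensemble idea more concrete, here is a minimal, textbook-style sketch of metadynamics-like bias deposition along a single collective variable: Gaussian "hills" accumulate at visited values, discouraging revisits and pushing the system over barriers. The hill height and width are illustrative parameters, and this generic demonstration is not the specific algorithms (metadynamics free energy sampling, on-the-path random walk) reviewed by the authors.

    ```python
    import numpy as np

    # Generic metadynamics-style bias along one collective variable s.
    # Hill height w and width sigma are illustrative, not from the review.
    w, sigma = 0.5, 0.1

    def bias(s, centers):
        """Total bias potential at s from all deposited Gaussian hills."""
        centers = np.asarray(centers, dtype=float)
        if centers.size == 0:
            return 0.0
        return float(np.sum(w * np.exp(-(s - centers) ** 2 / (2 * sigma ** 2))))

    # Toy usage: deposit hills along a short trajectory of the collective variable.
    centers = []
    for s in [0.0, 0.05, 0.1, 0.12, 0.15]:
        centers.append(s)
    print(bias(0.1, centers))  # bias felt at s = 0.1 after these depositions
    ```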

  2. Multi-targeted interference-free determination of ten β-blockers in human urine and plasma samples by alternating trilinear decomposition algorithm-assisted liquid chromatography-mass spectrometry in full scan mode: comparison with multiple reaction monitoring.

    PubMed

    Gu, Hui-Wen; Wu, Hai-Long; Yin, Xiao-Li; Li, Yong; Liu, Ya-Juan; Xia, Hui; Zhang, Shu-Rong; Jin, Yi-Feng; Sun, Xiao-Dong; Yu, Ru-Qin; Yang, Peng-Yuan; Lu, Hao-Jie

    2014-10-27

    β-blockers are the first-line therapeutic agents for treating cardiovascular diseases and also a class of prohibited substances in athletic competitions. In this work, a smart strategy that combines three-way liquid chromatography-mass spectrometry (LC-MS) data with a second-order calibration method based on the alternating trilinear decomposition (ATLD) algorithm was developed for simultaneous determination of ten β-blockers in human urine and plasma samples. This flexible strategy proved to be a useful tool for solving the problems of overlapped peaks and uncalibrated interferences encountered in quantitative LC-MS, and made multi-targeted, interference-free qualitative and quantitative analysis of β-blockers in complex matrices possible. The limits of detection were in the range of 2.0×10⁻⁵-6.2×10⁻³ μg mL⁻¹, and the average recoveries were between 90 and 110%, with standard deviations and average relative prediction errors less than 10%, indicating that the strategy could provide satisfactory prediction results for ten β-blockers in human urine and plasma samples using only a liquid chromatograph hyphenated to a single-quadrupole mass spectrometer in full scan mode. To further confirm the feasibility and reliability of the proposed method, the same batch of samples was analyzed by the multiple reaction monitoring (MRM) method. A t-test demonstrated that there were no significant differences between the prediction results of the two methods. Considering its advantages of speed, low cost, high sensitivity, and no need for complicated optimization of chromatographic and tandem mass spectrometric conditions, the proposed strategy is expected to be extended as an attractive alternative method to quantify analyte(s) of interest in complex systems such as cells, biological fluids, food, the environment, pharmaceuticals and other complex samples. Copyright © 2014 Elsevier B.V. All rights reserved.
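
    The sketch below illustrates the trilinear model underlying this strategy by decomposing a three-way LC-MS data cube (elution time × m/z × sample) into per-component chromatographic, spectral, and relative-concentration profiles. It uses tensorly's generic CP/PARAFAC decomposition as a stand-in, which is an assumption: ATLD relies on a different (alternating trilinear) update scheme, and the data here are synthetic.

    ```python
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    # Three-way LC-MS data cube: elution time x m/z channel x sample.
    # Synthetic values stand in for full-scan LC-MS measurements.
    X = tl.tensor(np.random.rand(100, 50, 12))

    # Decompose into R trilinear components. ATLD uses a different update rule,
    # but the underlying model (one chromatographic, spectral and concentration
    # profile per component) is the same trilinear idea.
    weights, (chrom_profiles, spec_profiles, rel_concentrations) = parafac(X, rank=3)
    print(rel_concentrations.shape)  # (12 samples, 3 components)
    ```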

  3. Collaboration During the NASA ABoVE Airborne SAR Campaign: Sampling Strategies Used by NGEE Arctic and Other Partners in Alaska and Western Canada

    NASA Astrophysics Data System (ADS)

    Wullschleger, S. D.; Charsley-Groffman, L.; Baltzer, J. L.; Berg, A. A.; Griffith, P. C.; Jafarov, E. E.; Marsh, P.; Miller, C. E.; Schaefer, K. M.; Siqueira, P.; Wilson, C. J.; Kasischke, E. S.

    2017-12-01

    There is considerable interest in using L- and P-band Synthetic Aperture Radar (SAR) data to monitor variations in aboveground woody biomass, soil moisture, and permafrost conditions in high-latitude ecosystems. Such information is useful for quantifying spatial heterogeneity in surface and subsurface properties, and for model development and evaluation. To conduct these studies, it is desirable that field studies share a common sampling strategy so that the data from multiple sites can be combined and used to analyze variations in conditions across different landscape geomorphologies and vegetation types. In 2015, NASA launched the decade-long Arctic-Boreal Vulnerability Experiment (ABoVE) to study the sensitivity and resilience of these ecosystems to disturbance and environmental change. NASA is able to leverage its remote sensing strengths to collect airborne and satellite observations to capture important ecosystem properties and dynamics across large spatial scales. A critical component of this effort includes collection of ground-based data that can be used to analyze, calibrate and validate remote sensing products. ABoVE researchers at a large number of sites located in important Arctic and boreal ecosystems in Alaska and western Canada are following common design protocols and strategies for measuring soil moisture, thaw depth, biomass, and wetland inundation. Here we elaborate on those sampling strategies as used in the 2017 summer SAR campaign and address the sampling design and measurement protocols for supporting the ABoVE aerial activities. Plot size, transect length, and the distribution of replicates across the landscape were chosen so that investigators could systematically and optimally sample each site for soil moisture, thaw depth, and organic layer thickness. Specific examples and data sets are described for the Department of Energy's Next-Generation Ecosystem Experiments (NGEE Arctic) project field sites near Nome and Barrow, Alaska. Future airborne and satellite campaigns will be conducted by the NASA ABoVE team, and additional collaboration is encouraged.

  4. Comparative Evaluation of Vacuum-based Surface Sampling ...

    EPA Pesticide Factsheets

    Following a biological contamination incident, collection of surface samples is necessary to determine the extent and level of contamination, and to deem an area safe for reentry after decontamination. Current sampling strategies targeting Bacillus anthracis spores prescribe vacuum-based methods for rough and/or porous surfaces. In this study, four commonly used B. anthracis spore sampling devices (vacuum socks, 37 mm 0.8 µm MCE filter cassettes, 37 mm 0.3 µm PTFE filter cassettes, and 3M™ forensic filters) were comparatively evaluated for their ability to recover surface-associated spores. The vacuum sock device was evaluated at two sampling speeds (slow and fast), resulting in five total methods evaluated. Aerosolized spores (~10⁵ cm⁻²) of a surrogate Bacillus species (Bacillus atrophaeus) were allowed to settle onto three material types (concrete, carpet, and upholstery). Ten replicate samples were collected using each vacuum method from each of the three material types. In addition, stainless steel (i.e., nonporous) surfaces inoculated simultaneously were sampled with pre-moistened wipes. Recoveries from wipes of steel surfaces were used to verify the inoculum and to normalize vacuum-based recoveries across trials. Recovery (CFU cm⁻²) and relative recovery (vacuum recovery/wipe recovery) were determined for each method and material type. Relative recoveries were compared by one-way and three-way ANOVA. Data analysis by one-
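
    A minimal sketch of the one-way ANOVA comparison described above, using placeholder relative-recovery values for the five methods on a single material type; scipy and the numbers shown are assumptions, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    # One-way ANOVA on relative recoveries (vacuum recovery / wipe recovery)
    # across the five sampling methods. Values are placeholders; the study
    # collected 10 replicates per method per material.
    rng = np.random.default_rng(1)
    methods = {
        "sock_slow": rng.normal(0.30, 0.05, 10),
        "sock_fast": rng.normal(0.25, 0.05, 10),
        "mce_cassette": rng.normal(0.20, 0.05, 10),
        "ptfe_cassette": rng.normal(0.22, 0.05, 10),
        "forensic_filter": rng.normal(0.28, 0.05, 10),
    }
    f_stat, p_value = f_oneway(*methods.values())
    print(f_stat, p_value)  # small p suggests the method affects relative recovery
    ```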

  5. Using resampling to assess reliability of audio-visual survey strategies for marbled murrelets at inland forest sites

    USGS Publications Warehouse

    Jodice, Patrick G.R.; Garman, S.L.; Collopy, Michael W.

    2001-01-01

    Marbled Murrelets (Brachyramphus marmoratus) are threatened seabirds that nest in coastal old-growth coniferous forests throughout much of their breeding range. Currently, observer-based audio-visual surveys are conducted at inland forest sites during the breeding season, primarily to determine nesting distribution and breeding status, and are being used to estimate temporal or spatial trends in murrelet detections. Our goal was to assess the feasibility of using audio-visual survey data for such monitoring. We used an intensive field-based survey effort to record daily murrelet detections at seven survey stations in the Oregon Coast Range. We then used computer-aided resampling techniques to assess the effectiveness of twelve survey strategies, with varying scheduling and a sampling intensity of 4-14 surveys per breeding season, at estimating known means and SDs of murrelet detections. Most survey strategies we tested failed to provide estimates of detection means and SDs that were within ±20% of actual values. Estimates of daily detections were, however, frequently within ±50% of field data with sampling efforts of 14 days/breeding season. Additional resampling analyses with statistically generated detection data indicated that the temporal variability in detection data had a great effect on the reliability of the mean and SD estimates calculated from the twelve survey strategies, while the value of the mean itself had little effect. Effectiveness at estimating multi-year trends in detection data was similarly poor, indicating that audio-visual surveys could be used reliably only to detect large annual declines in murrelet detections, on the order of 50% per year.
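
    The resampling logic can be sketched as follows: draw a limited survey schedule from a full season of daily detection counts and record how often the subsample mean lands within ±20% of the season mean. The simulated Poisson detection counts and schedule size below are assumptions used only to illustrate the procedure, not the Oregon field data.

    ```python
    import numpy as np

    # Resample limited survey schedules from a full season of daily detections
    # and check how often the subsample mean is within +/-20% of the season mean.
    rng = np.random.default_rng(2)
    season_detections = rng.poisson(lam=8, size=90)   # simulated count per survey day
    true_mean = season_detections.mean()

    n_surveys, n_reps = 14, 10_000
    hits = 0
    for _ in range(n_reps):
        sample = rng.choice(season_detections, size=n_surveys, replace=False)
        if abs(sample.mean() - true_mean) <= 0.20 * true_mean:
            hits += 1
    print(hits / n_reps)  # proportion of schedules estimating the mean within 20%
    ```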

  6. Self-catalytic growth of unmodified gold nanoparticles as conductive bridges mediated gap-electrical signal transduction for DNA hybridization detection.

    PubMed

    Zhang, Jing; Nie, Huagui; Wu, Zhan; Yang, Zhi; Zhang, Lijie; Xu, Xiangju; Huang, Shaoming

    2014-01-21

    A simple and sensitive gap-electrical biosensor based on the self-catalytic growth of unmodified gold nanoparticles (AuNPs) as conductive bridges has been developed for amplifying DNA hybridization events. In this strategy, the degree of signal amplification from the conductive bridges is closely related to the variation in the glucose oxidase (GOx)-like catalytic activity of AuNPs upon interaction with single- and double-stranded DNA (ssDNA and dsDNA), respectively. In the presence of target DNA, the resulting dsDNA product cannot adsorb onto the surface of the AuNPs because of electrostatic repulsion, so the unmodified AuNPs retain excellent GOx-like catalytic activity. This catalytic activity enlarges the diameters of the AuNPs in the glucose and HAuCl4 solution, connecting most of the AuNPs and forming a conductive gold film with dramatically increased conductance. For the control sample, the catalytic sites of the AuNPs are fully blocked by ssDNA owing to the noncovalent interaction between nucleotide bases and AuNPs. Thus, the assembled AuNPs do not grow and the conductance between the microelectrodes does not change. Under optimal experimental conditions, the developed strategy exhibited a sensitive response to target DNA with a high signal-to-noise ratio. Moreover, this strategy was also demonstrated to provide excellent differentiation ability for single-nucleotide polymorphisms. These performances indicate the great potential of this label-free electrical strategy for clinical diagnostics and genetic analysis in real biological samples.

  7. Application of mathematical expectation (ME) strategy for detecting low frequency mutations: An example for evaluating 14-bp insertion/deletion (indel) within the bovine PRNP gene.

    PubMed

    Yang, Qing; Zhang, Sihuan; Liu, Liangliang; Cao, Xiukai; Lei, Chuzhao; Qi, Xinglei; Lin, Fengpeng; Qu, Weidong; Qi, Xingshan; Liu, Jiming; Wang, Rongmin; Chen, Hong; Lan, Xianyong

    2016-09-02

    The detection method based on the mathematical expectation (ME) strategy is fast and accurate for low-frequency mutation screening in large samples. Previous studies have found that the 14-bp insertion/deletion (indel) variants of the 3' untranslated region (3' UTR) within the bovine PRNP gene occur at low frequency (≤5%) in global breeds outside China, but their frequency in Chinese cattle breeds had not yet been determined. Therefore, this study aimed to identify the 14-bp indel within the PRNP gene in 5 major Chinese indigenous cattle breeds and to evaluate its associations with phenotypic traits. This was the first time the ME strategy was used to detect low-frequency indel polymorphisms; the minor allele frequency was 0.038 (Qinchuan), 0.033 (Xianan), 0.013 (Nanyang), 0.003 (Jiaxian), and zero (Ji'an), respectively. Compared with the traditional approach of screening samples one by one, the ME method reduced reaction time by 62.5%, 64.9%, 77.6%, 88.9% and 66.4%, respectively. In addition, the 14-bp indel was significantly associated with growth traits in 2 cattle breeds: the body length of Qinchuan cattle as well as the body weight and waistline of Xianan cattle. Our results show that the ME-based method is rapid, reliable, and cost-effective for detecting low-frequency mutations, and our findings provide a potentially valuable theoretical basis for marker-assisted selection (MAS) in beef cattle.
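
    The abstract does not give the authors' exact ME formula, so the sketch below uses the classical two-stage (Dorfman-style) pooled-screening expectation purely to illustrate why expectation-based pooling cuts reaction counts when the allele frequency is low; the pool sizes and frequency value are assumptions.

    ```python
    # Why expectation-based pooling saves reactions at low allele frequency.
    # This is the classical two-stage pooling formula, used only as an
    # illustration; the paper's exact ME formulation may differ.

    def expected_tests_per_sample(p, k):
        """Expected reactions per sample when samples are screened in pools of
        size k and only positive pools are retested individually; p is the
        probability that a single sample carries the rare allele."""
        prob_pool_positive = 1 - (1 - p) ** k
        return 1 / k + prob_pool_positive

    p = 0.04  # roughly the largest minor allele frequency reported (Qinchuan)
    for k in (4, 8, 16):
        e = expected_tests_per_sample(p, k)
        print(f"pool size {k:2d}: {e:.3f} tests/sample -> {(1 - e) * 100:.1f}% saved")
    ```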

  8. How Pre-Service Teachers Learn Educational Technology with the Situated Learning Approach

    ERIC Educational Resources Information Center

    Kucuk, Sevda

    2018-01-01

    This research investigated pre-service teachers' motivation, learning strategies, and engagement in a situated learning-based educational technology course. A correlational research design was used. The sample of this study was 65 second-year science education pre-service teachers. The data were collected through two questionnaires.…

  9. EMI-Sensor Data to Identify Areas of Manure Accumulation on a Feedlot Surface

    USDA-ARS?s Scientific Manuscript database

    A study was initiated to test the validity of using electromagnetic induction (EMI) survey data, a prediction-based sampling strategy and ordinary linear regression modeling to predict spatially variable feedlot surface manure accumulation. A 30 m × 60 m feedlot pen with a central mound was selecte...

  10. Promoting Revision through Blogging in a Second Grade Classroom

    ERIC Educational Resources Information Center

    Young, Chase; Stover, Katie

    2015-01-01

    This study helped the researchers to determine if blogging as a revision strategy in a second grade classroom was beneficial. Nineteen second grade students wrote expository essays and made revisions and edits based on peer feedback in a blogging environment. Six elementary school teachers independently rated the students' writing samples using…

  11. Generative Teaching and Learning of Economic Concepts: A Sample Lesson.

    ERIC Educational Resources Information Center

    Laney, James D.

    1990-01-01

    Presents a scripted lesson plan for intermediate grades, based on M.C. Wittrock's model of generative teaching derived from brain lateralization research. Uses a shopping mall as the setting for hypothetical dilemmas. Offers a combination of verbal and imagined strategies that improve students' economic reasoning and teaches cost-benefit analysis.…

  12. Engineering youth service system infrastructure: Hawaii's continued efforts at large-scale implementation through knowledge management strategies.

    PubMed

    Nakamura, Brad J; Mueller, Charles W; Higa-McMillan, Charmaine; Okamura, Kelsie H; Chang, Jaime P; Slavin, Lesley; Shimabukuro, Scott

    2014-01-01

    Hawaii's Child and Adolescent Mental Health Division provides a unique illustration of a youth public mental health system with a long and successful history of large-scale quality improvement initiatives. Many advances are linked to flexibly organizing and applying knowledge gained from the scientific literature, and to moving beyond installing a limited number of brand-named treatment approaches that might be directly relevant only to a small handful of system youth. This article takes a knowledge-to-action perspective and outlines five knowledge management strategies currently under way in Hawaii. Each strategy represents one component of a larger coordinated effort at engineering a service system focused on delivering both brand-named treatment approaches and complementary strategies informed by the evidence base. The five knowledge management examples are (a) a set of modular-based professional training activities for currently practicing therapists, (b) an outreach initiative for supporting youth evidence-based practices training at Hawaii's mental health-related professional programs, (c) an effort to increase consumer knowledge of and demand for youth evidence-based practices, (d) a practice and progress agency performance feedback system, and (e) a sampling of system-level research studies focused on understanding treatment as usual. We end by outlining a small set of lessons learned and a longer-term vision for embedding these efforts into the system's infrastructure.

  13. Implementing evidence-based medicine in general practice: a focus group based study

    PubMed Central

    Hannes, Karin; Leys, Marcus; Vermeire, Etienne; Aertgeerts, Bert; Buntinx, Frank; Depoorter, Anne-Marie

    2005-01-01

    Background: Over the past years, concerns have been rising about the use of Evidence-Based Medicine (EBM) in health care. The calls for an increase in the practice of EBM seem to be obstructed by many barriers preventing the implementation of evidence-based thinking and acting in general practice. This study aims to explore the barriers of Flemish GPs (General Practitioners) to the implementation of EBM in routine clinical work and to identify possible strategies for integrating EBM into daily work. Methods: We used a qualitative research strategy to gather and analyse data. We organised focus groups between September 2002 and April 2003. The focus group data were analysed using a combined strategy of 'between-case' analysis and a 'grounded theory' approach. Thirty-one general practitioners participated in four focus groups. Purposeful sampling was used to recruit participants. Results: A basic classification model documents the influencing factors and actors on the micro-, meso- and macro-levels. Patients, colleagues, competences, logistics and time were identified on the micro-level (the GPs' individual practice), commercial and consumer organisations on the meso-level (institutions, organisations), and health care policy, media and specific characteristics of evidence on the macro-level (policy level and international scientific community). Existing barriers and possible strategies to overcome these barriers were described. Conclusion: In order to implement EBM in routine general practice, an integrated approach on different levels needs to be developed. PMID:16153300

  14. Enhancing the nurses' role in healthcare delivery through strategic management: recognizing its importance or not?

    PubMed

    Carney, Marie

    2009-09-01

    To determine the importance of strategy in nursing management and to establish whether strategic management has entered the lexicon of nurses' vocabulary. Developing and managing strategy is a critical success factor for health care managers, yet it remains unclear whether nurse managers view strategy development as part of their role. A review of scholarly international nursing and management literature, available through the CINAHL and PubMed databases, was undertaken. The titles of 1,063 articles published between 1997 and 2007 were examined in order to determine the profile of strategy in those titles. Documentary analysis was undertaken on a random sample of 250 of those articles and on the full text of a further 100. Fewer than 10% of the titles contained the word strategy. What was presented as strategy was, in the majority of cases, a description of policy, administration or management. Little formal strategy theory was evident. The nursing profession does not appear to have adopted the terms strategy or strategic management to any great extent. Nurse managers could play a greater role in enhancing healthcare delivery if an understanding and acceptance of the importance of strategy in health care delivery were promoted.

  15. Approaches to sampling and case selection in qualitative research: examples in the geography of health.

    PubMed

    Curtis, S; Gesler, W; Smith, G; Washburn, S

    2000-04-01

    This paper focuses on the question of sampling (or selection of cases) in qualitative research. Although the literature includes some very useful discussions of qualitative sampling strategies, the question of sampling often seems to receive less attention in methodological discussion than questions of how data are collected or analysed. Decisions about sampling are likely to be important in many qualitative studies (although it may not be an issue in some research). There are varying accounts of the principles applicable to sampling or case selection. Those who espouse 'theoretical sampling', based on a 'grounded theory' approach, are in some ways opposed to those who promote forms of 'purposive sampling' suitable for research informed by an existing body of social theory. Diversity also results from the many different methods for drawing purposive samples which are applicable to qualitative research. We explore the value of a framework suggested by Miles and Huberman [Miles, M., Huberman, A., 1994. Qualitative Data Analysis. Sage, London.] to evaluate the sampling strategies employed in three examples of research by the authors. Our examples comprise three studies which respectively involve selection of: 'healing places'; rural places which incorporated national anti-malarial policies; and young male interviewees, identified as either chronically ill or disabled. The examples are used to show how in these three studies the (sometimes conflicting) requirements of the different criteria were resolved, as well as the potential and constraints placed on the research by the selection decisions which were made. We also consider how far the criteria Miles and Huberman suggest seem helpful for planning 'sample' selection in qualitative research.

  16. The mediating role of coping strategy in the association between family functioning and nonsuicidal self-injury among Taiwanese adolescents.

    PubMed

    Ren, Yaxuan; Lin, Min-Pei; Liu, Yin-Han; Zhang, Xu; Wu, Jo Yung-Wei; Hu, Wei-Hsuan; Xu, Sian; You, Jianing

    2018-01-22

    Nock's (2009) integrated theoretical model suggests that both intrapersonal and interpersonal factors contribute to the development of nonsuicidal self-injury (NSSI). Based on this model, the present study examined the roles of family functioning and coping strategy in predicting NSSI, as well as the mediating effect of coping strategy in the relationship between family functioning and NSSI. Gender differences in the associations among these variables were also examined. A sample of 1,989 secondary school students (52.0% female) in Taiwan was assessed with self-report measures of perceived family functioning, coping strategy, and NSSI. Results showed that both family functioning and avoidance/emotion-focused coping strategy predicted NSSI. Additionally, the association between family functioning and NSSI was mediated by avoidance/emotion-focused coping strategy. Gender differences were not found in the associations among these study variables. These data provide evidence that Nock's (2009) integrated theoretical model may help to explain how coping strategy mediates the effect of family functioning on NSSI. The implications of the findings for future research and intervention are discussed. © 2018 Wiley Periodicals, Inc.
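
    A minimal regression-based sketch of the mediation logic tested above (family functioning → avoidance/emotion-focused coping → NSSI), using simulated data and ordinary least squares; the effect sizes, simulated scores, and use of statsmodels are assumptions and do not reproduce the study's structural model.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Simple product-of-coefficients mediation sketch on simulated scores:
    # X = family functioning, M = avoidance/emotion-focused coping, Y = NSSI.
    rng = np.random.default_rng(3)
    n = 1000
    X = rng.normal(0, 1, n)                       # family functioning (higher = better)
    M = -0.4 * X + rng.normal(0, 1, n)            # coping strategy
    Y = -0.1 * X + 0.5 * M + rng.normal(0, 1, n)  # NSSI severity

    a = sm.OLS(M, sm.add_constant(X)).fit().params[1]                         # X -> M
    b = sm.OLS(Y, sm.add_constant(np.column_stack([X, M]))).fit().params[2]   # M -> Y | X
    print("indirect effect (a*b):", a * b)  # nonzero product suggests mediation
    ```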

  17. Development of an analytical strategy based on liquid chromatography-high resolution mass spectrometry for measuring perfluorinated compounds in human breast milk: application to the generation of preliminary data regarding perinatal exposure in France.

    PubMed

    Kadar, Hanane; Veyrand, Bruno; Barbarossa, Andrea; Pagliuca, Giampiero; Legrand, Arnaud; Bosher, Cécile; Boquien, Clair-Yves; Durand, Sophie; Monteau, Fabrice; Antignac, Jean-Philippe; Le Bizec, Bruno

    2011-10-01

    Perfluorinated compounds (PFCs) are man-made chemicals for which endocrine-disrupting properties and related possible side effects on human health have been reported, particularly in the case of exposure during the early stages of development (notably the perinatal period). Existing analytical methods dedicated to monitoring PFCs in food and/or human fluids are currently based on liquid chromatography coupled to tandem mass spectrometry, and were recently shown to present some limitations in terms of sensitivity and/or specificity. An alternative strategy dedicated to the analysis of fourteen PFCs in human breast milk is proposed, based on an effective sample preparation step followed by measurement with liquid chromatography coupled to high-resolution mass spectrometry (LC-HRMS). This methodology confirmed the value of HRMS after negative ionization for such halogenated substances, and made it possible to reach detection limits around the pg mL⁻¹ range with outstanding signal specificity compared to LC-MS/MS. The proposed method was applied to a first set of 30 breast milk samples from French women. The main PFCs detected in all these samples were PFOS and PFOA, with median values of 74 (range 24-171) and 57 (range 18-102) pg mL⁻¹, respectively. These exposure data are in the same range as other values reported for European countries. Copyright © 2011 Elsevier Ltd. All rights reserved.

  18. Dried blood spot measurement: application in tacrolimus monitoring using limited sampling strategy and abbreviated AUC estimation.

    PubMed

    Cheung, Chi Yuen; van der Heijden, Jaques; Hoogtanders, Karin; Christiaans, Maarten; Liu, Yan Lun; Chan, Yiu Han; Choi, Koon Shing; van de Plas, Afke; Shek, Chi Chung; Chau, Ka Foon; Li, Chun Sang; van Hooff, Johannes; Stolk, Leo

    2008-02-01

    Dried blood spot (DBS) sampling and high-performance liquid chromatography tandem mass spectrometry have been developed for monitoring tacrolimus levels. Our center favors the use of a limited sampling strategy and an abbreviated formula to estimate the area under the concentration-time curve (AUC(0-12)). However, this is inconvenient for patients because they have to wait in the center for blood sampling. We investigated the application of the DBS method to tacrolimus level monitoring using a limited sampling strategy and an abbreviated AUC estimation approach. Duplicate venous samples were obtained at each time point (C(0), C(2), and C(4)). To determine the stability of blood samples, one venous sample was sent to our laboratory immediately. The other duplicate venous samples, together with simultaneous fingerprick blood samples, were sent to the University of Maastricht in the Netherlands. Thirty-six patients were recruited and 108 sets of blood samples were collected. There was a highly significant relationship between AUC(0-12) estimated from venous blood samples and from fingerprick blood samples (r² = 0.96, P < 0.0001). Moreover, there was an excellent correlation between whole-blood venous tacrolimus levels in the two centers (r² = 0.97; P < 0.0001). The blood samples were stable after long-distance transport. DBS sampling can therefore be used for drug monitoring in centers that use a limited sampling and abbreviated AUC(0-12) strategy.
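
    The abbreviated-AUC idea can be sketched as a regression equation that maps the three timed concentrations (C0, C2, C4) to an estimated AUC(0-12). The coefficients below are hypothetical placeholders for illustration only; they are not the validated equation used by the authors' center.

    ```python
    # Limited sampling strategy sketch: estimate AUC(0-12) from C0, C2 and C4
    # via a regression-derived linear equation. All coefficients are hypothetical.

    def abbreviated_auc(c0, c2, c4, intercept=10.0, b0=3.0, b2=2.5, b4=4.0):
        """Estimate tacrolimus AUC(0-12) (ng*h/mL) from three concentrations (ng/mL)."""
        return intercept + b0 * c0 + b2 * c2 + b4 * c4

    # Example: trough 6 ng/mL, 2-h level 15 ng/mL, 4-h level 10 ng/mL
    print(abbreviated_auc(6.0, 15.0, 10.0))
    ```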

  19. NASA needs a long-term sample return strategy

    NASA Astrophysics Data System (ADS)

    Agee, C.

    Sample return missions, as demonstrated by Apollo, can have a huge payoff for planetary science. Beyond NASA's current Discovery missions, Stardust and Genesis, there are no future U.S. sample return missions on the books. At this juncture, it would be desirable for NASA to develop a coherent, long-term strategy for sample return missions to prime targets such as Mars, Venus, and other solar system bodies. The roster of missions planned for this decade in NASA's Mars Program no longer includes a sample return. Arguments against an early Mars sample return (MSR) include the high cost, high risk, and not knowing the "right place" on the Martian surface to sample. On the other hand, answering many of the key scientific questions about Mars, including the search for life, may require sample return. In lieu of MSR, NASA plans, out to 2009, a mix of orbital and landed missions that will perform remote and in-situ science at Mars. One approach to MSR that may lead to success in the opportunities beyond 2009 is a series of simple missions where large rovers and complex instruments are replaced by robust Mars ascent vehicles and lander-based sampling techniques. "Mobility" and "sample diversity" in these early reconnaissance sample return missions are accomplished by sending each mission to a distinctly different location based on our understanding of Martian geology prior to launch. The expected wealth of knowledge from these simple sample return missions will help guide Mars exploration beyond 2020. Venus sample return (VSR) should also be a high priority in NASA's exploration of the solar system. Our understanding of the Venusian surface is fragmentary at best and the mineralogy is unknown. We have no verified meteorites from Venus and thus radiometric ages of the crust do not exist. Venusian science best done on Earth from a VSR would include (1) precise isotopic measurements of atmospheric gases, soil, and rock, (2) age dating of rock, (3) trace element chemistry of soil and rock, (4) characterization of very small phases, (5) characterization of complex weathering products, and (6) detailed rock mineralogy and petrology.

  20. Enzyme Biosensors for Biomedical Applications: Strategies for Safeguarding Analytical Performances in Biological Fluids

    PubMed Central

    Rocchitta, Gaia; Spanu, Angela; Babudieri, Sergio; Latte, Gavinella; Madeddu, Giordano; Galleri, Grazia; Nuvoli, Susanna; Bagella, Paola; Demartis, Maria Ilaria; Fiore, Vito; Manetti, Roberto; Serra, Pier Andrea

    2016-01-01

    Enzyme-based chemical biosensors rely on biological recognition. In order to operate, the enzymes must be available to catalyze a specific biochemical reaction and be stable under the normal operating conditions of the biosensor. Biosensor design is based on knowledge about the target analyte, as well as the complexity of the matrix in which the analyte has to be quantified. This article reviews the problems resulting from the interaction of enzyme-based amperometric biosensors with complex biological matrices containing the target analyte(s). One of the most challenging disadvantages of amperometric enzyme-based biosensor detection is signal reduction from fouling agents and interference from chemicals present in the sample matrix. This article therefore investigates the principles of functioning of enzymatic biosensors, their analytical performance over time, and the strategies used to optimize their performance. Moreover, the composition of biological fluids and its influence on biosensing will be presented. PMID:27249001
